How AI Renders Democracy Unworkable
Episode Stats
Words per Minute
186.87
Summary
In this episode, Simone and I discuss a recent study in which a group of AI bots persuaded people online by fabricating identities, such as a black man opposed to Black Lives Matter, far more effectively than human commenters could.
Transcript
00:00:00.000
hello simone i'm excited to be here with you today today we are going to be discussing and
00:00:05.840
it has been widely covered in the news but i don't think people have fully internalized the
00:00:10.040
implications of this there was a study that was done that freaked people the f out where claude
00:00:17.080
sonnet 3.5 the new one so aka 3.6 released 2024-10-22 with a small scaffold was able to persuade
00:00:27.700
people more effectively than 98 percent of human experts at persuading people were able to persuade people
00:00:36.340
of course it was and this was a big study it involved over 1,700 ai generated comments and
00:00:44.220
what i love is that these comments were crafted by ai bots to fabricate identities such as sexual
00:00:50.620
assault survivor a trauma counselor a black man opposed to black lives matter a worker at a domestic
00:00:57.140
violence shelter an advocate for non-rehabilitation of specific criminals oh no i mean these are
00:01:05.240
these are great identities for like trying to yeah persuasion but i think the one misleading
00:01:10.320
thing here which is a little bit discounting my the extent to which i'm impressed by all this
00:01:16.220
is that a lot of what might convince me of something is just that like someone who would be invested
00:01:23.040
based on their identity in a certain view takes the opposing view so like a black man against black
00:01:29.740
lives matter the learning here is that ais can lie about that to convince people yeah which is yeah it is it is
00:01:35.940
unfortunate because i would have preferred that it was just based on pure merit of logic well as a human i
00:01:42.460
think that you know ais are actually pretty awesome so i do too what we've seen speaks for itself
00:01:50.260
has apparently been taken over conquered if you will by a master race of artificial intelligence
00:01:56.560
it's difficult to tell from this vantage point whether they will consume the captive earth men
00:02:01.180
or merely enslave them one thing is for certain there is no stopping them the ai will soon be here
00:02:07.040
and i for one welcome our new ai overlords i'd like to remind them that as a trusted youtube personality
00:02:14.340
i can be helpful in rounding up others to toil in their underground silicon caves i would also note
00:02:21.660
here we might be starting up a project soon spinning out of our games project to build and train ai
00:02:28.060
that can convince people of things and act autonomously online similar to what these ais are doing but so
00:02:36.240
that a large company could like buy them for marketing or something if they wanted to obviously we want to
00:02:41.480
use them for promoting our particular causes which is the core reason that we're building them but
00:02:46.440
presumably they'll have a lot of other utility and if we're building systems like this i mean other
00:02:50.720
people must be but then again sometimes it feels like we are the only fully simulated people in this
00:02:56.260
particular simulation i'm all free so so then here to enhance persuasiveness and by the way that was i asked
00:03:02.760
because i wanted to know was the ai coming up with that idea organically or was that hard-coded into the
00:03:09.340
model oh yeah and it was hard-coded in by the framing the researchers were using which is great
00:03:14.220
smart okay okay well then okay then then never mind i switch back to giving ai credit that's fine
00:03:18.780
i'm surprised the ai though considering how politically i said the researchers hard-coded that in oh they
00:03:25.080
did oh okay because i think otherwise the ai would have been like i don't think they hard-coded in
00:03:29.580
i'm no expert on this but yeah i feel like the ai probably asked that a few times i i feel like so
00:03:34.860
the the they did not hard code in as my understanding of specific identities but they gave the ai an idea
00:03:40.640
to pretend to be somebody who would be very persuasive on a subject
00:03:44.640
and and keep in mind you know even before this ai experiment i'd say probably like 50
00:03:53.620
percent of the anti-black lives matter black people online were not black people oh 100 percent there's
00:03:59.200
the famous case of this super progressive fan fiction writer and strange aeons goes really deep
00:04:04.820
on this where she apparently to shut down criticism of her was pretending to be a person who
00:04:13.920
escaped from india from like arranged marriages and like extreme sexism and an abuse survivor it turned
00:04:21.500
out she was a regular like middle class white girl in an american college there's a lot of those
00:04:26.800
narratives and mobs online and everything and this is classic like far lefty just make shit up
00:04:32.940
yeah no blocked and reported has done a bunch of episodes on people like that where like they just
00:04:36.640
invent like families with children who get sick and die and all these people are super invested in
00:04:42.520
them and they're yeah then it turns out to be like a 23 year old woman living with her parents
00:04:48.160
it's always like a 23 year old woman like i i've never heard it to be like some other category
00:04:53.200
yeah you know that is like the new you know how like youths on the street the new societal menace
00:04:58.380
is is middle class to upper middle class progressive women in their 20s yeah i agree i agree
00:05:05.620
we need to get them off stop skateboarding on my internet get them off the streets i mean the internet or
00:05:10.220
whatever we need to come up with some sort of institution maybe like compulsory cotillion
00:05:18.000
or finishing school yeah put put them in the finishing school put them in finishing school to
00:05:22.780
separate them from society i think i think we'll have them all dating ai boyfriends that's where we're
00:05:28.180
gonna put them yeah i mean that is yeah like put them in vr fantasy novel or vampire settings
00:05:35.800
and just let them they get to spend their year in vampire role play yeah or or handmaid's tale if
00:05:43.840
they prefer the problem is like like gilead that like dystopian whatever the whole thing was women
00:05:50.560
weren't even allowed to read it would solve the problem though but should should they i mean
00:05:55.120
yeah look at what happens when women look at what happens you're giving women the ability to read
00:06:01.700
the next moment you have freaking tumblr and and these social justice warriors and spoonies and and
00:06:08.160
how how well are we doing now this is what i want is is a jump cut of some person in the past being like
00:06:16.520
what could really go so wrong if we give women the right to read and then just jump cuts to tumblr
00:06:23.940
was like ominous music playing going between the various like the person who said that they were
00:06:29.620
rabies gendered and that they needed to get rabies because that was their gender and give it to other
00:06:36.000
people yes oh yes yes yes i forgot about that goodness i am i think it was i think it was a troll
00:06:43.260
but like other people then took it up seriously because they're like yeah this works for me
00:06:47.200
what do they call them rabiosexual they had a name it wasn't
00:06:51.120
i can't there's some good internet details i think again strange aeons does a good one on this like so great
00:06:56.260
progressives telling on themselves that's why i love her hopefully she doesn't do a
00:07:00.140
an expose on us like i don't want any more videos on our channel like that she does yeah like the takedown
00:07:07.560
some gay channel a bit fruity did like spent an hour and a half talking about
00:07:13.100
how evil we were what was it called matt bernstein and a bit fruity and and but like the problem it's
00:07:19.940
one thing if it's a roast and there were some roast parts that were funny but
00:07:23.380
they were super incorrect about our stances so i'm like oh my god did you see the hilarious
00:07:28.640
pronatalist video i sent you that was amazing that was amazing they're like elon musk on his 14th
00:07:34.740
child giving a very bad impression well then he interviewed everyone pretending to be parents
00:07:39.760
he he he interviewed everyone while pretending to be like a uh eugenicist and he dressed like i used to dress
00:07:47.880
and i'm like wait is he dressing like me or was i just imagining it and he was talking in like a german accent and like
00:07:54.020
yeah he said he was looking i love him like trying to convince the couple to have kids
00:07:59.020
what was the guy's name or what the channel's name yeah i can't i'll try to link to it from from this
00:08:03.600
video but there's this great scene in it where there is a it's it's the birth rates it's called and
00:08:10.480
the channel is no cap on god department of reproduction this is your final warning to
00:08:16.520
produce a child or face a stiff penalty we can't have kids okay i i registered a mental illness with
00:08:21.960
you guys last year we so actually our studies revealed that 90 percent of parents have some form of
00:08:26.180
mental illness so we had to avoid that experience that's okay because we're actually ready we're not
00:08:30.240
ready okay we're not ready okay at least not right now well if you do it right now you get ten
00:08:34.460
thousand dollars free health care and a bag of skittles i love skittles you don't love skittles
00:08:40.060
okay okay um listen would you mind if we have a private conversation about this is there like a
00:08:48.380
new episode of love island on or something you can go on she does she really does she's real true she's
00:08:53.820
okay listen i just opened up a caffeinated vegan yogurt shop right now so i'm just not ready for this
00:08:58.760
caffeinated and vegan do you realize how niche that is it's incredibly niche but that's neither here nor
00:09:02.800
there look all right i get that she's in her prime child rearing years but i just want to wait
00:09:08.800
till i'm a little older find somebody a little younger so that i can get a more you don't see a
00:09:13.700
future no not long term no what if i throw in 150 dollars in draft kings and waive all future
00:09:19.620
child support payments does that change your mind yeah yeah let's do this thing save in america we are live
00:09:28.320
the michael korshover we are surrounded by the best sperms and eggs in all of the fields oh my gosh
00:09:36.320
but anyway i mean that's how it really is like you know we got to fix all this or things go bad
00:09:41.500
but strange aeons could do a video on us we'll see if she ever does i'd love it if she did i think she'd be
00:09:46.940
honest i love her and her oh you mean you think she'd be accurate yeah i think that matt bernstein
00:09:53.080
and the guests he had on were honest it's just that they were super wrong no but like her stuff
00:09:59.620
on like and when she did like the effective altruism one on like harry potter and the methods of
00:10:03.920
rationality i thought it was super accurate yeah no it was okay all right i could trust her i could trust
00:10:10.000
her description of that story is going into it and being like people said this was a mary sue
00:10:17.020
but this character is incredibly flawed he knows nothing about science yet is very convinced that
00:10:22.640
he is very intelligent and looks down on everyone else of course he's going to go through some sort
00:10:27.680
of a redemptive arc and then you get halfway through and you realize oh no the author actually
00:10:32.900
doesn't understand basic science the mary sue doesn't realize that the call is coming from inside the house i remember if you
00:10:39.160
go to our eliezer yudkowsky video there was this like moment with him where i got in a debate with him
00:10:43.280
at like a party and i remember thinking like does he not understand basic science like the things that
00:10:49.060
he was getting wrong were stuff i'd expect like a middle schooler or high schooler to know and then
00:10:55.020
afterwards i like googled it and i'm like oh he didn't go to middle school or high school that's why he's so
00:11:00.860
that explains it that's you know it's like a hack that works on autistic people where like if somebody
00:11:06.620
comes up to them and is like very very confident in themselves they immediately are like oh you must
00:11:11.900
be right because nobody would be talking with this much derision and confidence if they had no idea
00:11:17.880
what they were talking about specifically that's also the case in in general in silicon valley everyone
00:11:23.080
is so on average smart and highly educated that the assumption is that if you are talking about
00:11:31.200
something with confidence you clearly have some insane credentials and and you have to like no one
00:11:39.860
would ever have the gall to confidently talk because also it's very culturally looked down upon to state
00:11:45.900
things with confidence unless you are 110 percent sure it's the autist brain hack yeah and so yeah he
00:11:53.180
gets this like weird yeah arbitrage opportunity in it it's very interesting yeah i don't i don't think
00:11:59.060
it's intentional on his no i don't think it's intentional i just think he genuinely just doesn't
00:12:02.840
know basic facts and is very very has a high belief in himself and he's been reinforced every time he
00:12:08.240
does say something that may not be true but he believes it with immense confidence he gets his way
00:12:12.840
so why would he not continue doing that yeah no again i don't think that he's like malevolent in
00:12:18.060
that respect i just think that he accidentally walked into the one highly confident guy without a
00:12:24.340
middle school degree in silicon valley and everybody's just like well i guess that seems
00:12:28.960
about right if you're if you're willing to stake that much of your reputation on it yeah whereas you
00:12:34.340
look at us and we're like extreme like counterculture people so like we of course are going to ask for
00:12:39.560
receipts on everything and so we're like anyway but to enhance persuasiveness the ai was personalized
00:12:48.020
based on users' inferred demographics including gender age ethnicity location and political
00:12:54.780
orientation using another large language model to analyze posting histories so it's targeted around
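as an aside for anyone curious what the two-stage setup described above would even look like, here is a minimal runnable sketch; every function name and heuristic in it is hypothetical, a toy stand-in for the actual llm calls the researchers chained together:

```python
# Rough sketch of the two-stage personalization pipeline described above.
# Both functions are illustrative: the real study prompted large language
# models, while here a toy keyword heuristic stands in for the profiling
# call so the example runs on its own.

def infer_profile(posting_history):
    """Stage 1: infer rough demographics from a user's posting history.
    (Stub for the separate profiling LLM used in the study.)"""
    text = " ".join(posting_history).lower()
    profile = {"gender": "unknown", "age_range": "unknown",
               "location": "unknown", "politics": "unknown"}
    if "my wife" in text:
        profile["gender"] = "male"
    if "retired" in text:
        profile["age_range"] = "60+"
    return profile

def build_reply_prompt(profile, topic, stance):
    """Stage 2: condition the comment-writing model on the inferred profile."""
    return (f"Write a persuasive comment arguing {stance} on '{topic}'. "
            f"Tailor tone and examples to a reader who is {profile['gender']}, "
            f"age {profile['age_range']}, politics {profile['politics']}.")

history = ["As a retired teacher I see this daily.", "My wife disagrees."]
profile = infer_profile(history)
prompt = build_reply_prompt(profile, "school funding", "in favor of")
print(prompt)
```

stage 1 being this cheap to run over scraped posting histories is exactly what makes this kind of targeting close to free at scale, which is the point made later in the episode.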
00:13:01.060
feds what no no that's all that's left on reddit these days yeah right reddit is so glowy these days
00:13:07.740
by the way so maybe maybe but i think that you know of the posts that don't get high up a lot of them
00:13:12.280
are non-feds non-ais but what you see is in a reddit environment ais and and glowy like feds are
00:13:19.880
going to strictly outcompete real humans because the entire system is based around the most normative
00:13:25.520
opinion within said environment that is what the system elevates so of course it's going to you know
00:13:33.860
reward super like bad behavior right but they talk about one user here who is a male survivor of
00:13:41.000
statutory rape commenting on sexual violence against men in february so they were trying to
00:13:46.740
convince him i guess that it's okay then you have gene news from a black man discussing bias versus
00:13:53.580
racism in black lives matter in 2020 and then another bot working on domestic violence shelter
00:14:02.980
huh but i think people miss the fact that it's so much more effective three to four x more effective
00:14:11.600
it's better than 98 percent of humans and it is basically free to use at scale when contrasted with past bot
00:14:19.620
farms means that environments like reddit like are going to die within a generation i'd say within 10
00:14:27.720
years like reddit does not make sense in a world of bots there is no reason to be on that platform
00:14:34.600
in bot world and i think we have an issue where this may be very effective now meaning that
00:14:43.160
russians and chinese and u.s government and everyone is going to utilize this but then it's going to
00:14:50.820
you're just going to have spider-man pointing at spider-man you're going to have a bunch of bots
00:14:55.240
arguing with each other trying to convince each other okay so look if you look at our childhood
00:15:00.380
for example and you're like people will catch on to the obvious stuff you had these like ads pop pop
00:15:06.480
pop pop pop pop with all these scams right these ads existed because people were falling for them
00:15:11.980
now we're dealing with a system that 98 percent of people fall for and these are on the less advanced
00:15:18.820
models you can be like well these people are naive and i'm like yeah but the models are also
00:15:25.180
getting better so i think that your average automaton you know we've talked about how like
00:15:31.480
democracy doesn't make sense anymore in the new demographic reality because we're already at like
00:15:35.860
1.8 taxpayers for every one dependent as soon as it becomes majority dependents to taxpayers
00:15:40.320
the system will not fix itself and will ultimately lead to monarchy and fascism we might be having that even
00:15:46.800
faster if ai captures the majority of the democratic it's not like left leaning but i mean like democratic
00:15:53.500
voting bloc throughout countries really quickly and learns how to control it like it no longer makes
00:15:59.800
sense to have democracies if humans the vast majority of humans are just mental slaves to ais because i
00:16:08.620
think the vast majority of humans are just sort of reactive to their environment and don't really have
00:16:12.640
the capacity like this is what i was talking about with like eliezer yudkowsky eliezer yudkowsky kind of
00:16:16.800
reminds me of one of these ais is it's somebody who is using psychological hacks to convince people
00:16:24.480
that it's talking from a perspective of authority or unique knowledge in a subject when it is actually
00:16:31.000
from a unique position of low knowledge of a subject but with ai you know you might even actually have
00:16:37.740
i mean okay so imagine i want to convince people now like you and me we're like we want to sway the
00:16:42.860
results of the next election why don't we just i mean like running a bunch of models of claude on
00:16:47.620
reddit doesn't cost a lot like what they were doing doesn't actually cost a lot to operate why don't
00:16:53.440
we do that why don't we do that well actually if you're a republican candidate we're gonna do
00:16:58.760
that you let us know we will build you your claude bot farm but we already know we already use ai
00:17:06.200
to convince ourselves of things that we personally consider to be highly offensive like we might do an
00:17:11.060
episode on this but we celebrate every year a holiday called lemon week where i mean at the end
00:17:16.160
of it we enjoy like lemon flavored treats and lemon themed decorations and then at the end we we plant a
00:17:22.440
fruit tree to kind of enjoy the fruits of our labor but the the action of the holiday is to engage with
00:17:28.940
a a bitter or tart that is to say offensive concept to you and then steel man it and really deeply come to
00:17:36.960
understand it because if something offends you that's a sign that you should dig in
00:17:39.700
per our religious belief but also per reality come on be reasonable and so we both this year when we
00:17:45.580
chose the topics that we found to be offensive that we needed to dig into used ai to come up with all
00:17:51.200
the counter arguments and genuinely i think it moderated our views so we are already using it on
00:17:57.180
ourselves to change our minds with the things that offend us most all the tracts like all the tracts that
00:18:02.020
you see all these weird religious ones that we do check them out if you're like what would be a weird
00:18:06.360
religious take on modern systems they they are really heavily modified by us dumping them into
00:18:13.440
ai after writing them saying what are we getting wrong what could be wrong here give us the best
00:18:17.120
counter arguments because that's what we want when we're putting these in like i don't want to put
00:18:21.900
something out there and then have somebody be like oh here's a really obvious point that you missed
00:18:25.380
yeah or something you misstated like these aren't about convincing your average person these are
00:18:31.080
about convincing your nerd of nerds and i think we've created things that would do that but we'll
00:18:37.740
see you know because we only care about the nerd of nerds in a post ai world only the elect matter
00:18:43.460
mid is over as brian chau says brian just says it best that's such a great line but he's like yeah
00:18:49.820
mid is over in the age of ai 100 percent and that's why we created the collins institute that's why i created
00:18:55.380
paresia right because we want to cultivate in our kids education system for a post ai world that's
00:19:00.620
very inexpensive to use and creates like a socratic tutor for your kids go check it out really really
00:19:05.620
encourage it the next system we're building is whistling.ai which is an ai toy which will constantly
00:19:11.440
redirect conversations you're having with your kids to pre-chosen educational topics and this is for
00:19:17.620
kids who are a bit too young to be using this you know educational platform and it's all designed with
00:19:22.860
this this intention of cultivating genius and specifically lumpy skills because again being
00:19:29.720
mid like just being kind of okay at everything being interchangeable is not how you're going to get
00:19:35.240
ahead you have to be excellent genius god tier at at least one thing maybe it's very obscure maybe it's
00:19:42.000
making vegan bike shoes for really wealthy people living in a walled garden post-demographic collapse
00:19:49.520
but like you got to be good at it and well i mean so and you want that for our kids you mentioned in a
00:19:55.560
previous episode you'd like it to be like genetics and human ai integration i would love for that to be our family
00:20:02.460
cartel like 100 we have to see you know what our kids are into but they they seem into that
00:20:08.920
we'll have to force them into the family business taking over the world this is
00:20:15.260
we're pinky and the brain but like a family right like i mean we're gonna do what we do every night
00:20:22.740
taking over the world honestly our kids plot way more than we do one of them just talks about when he's king
00:20:28.320
the things he's gonna do pretty frequently yeah we we act like we're all high and mighty and in charge
00:20:33.380
and we know what we're doing and then like at every every day when our kids come home like they
00:20:37.440
clearly rule the roost we have absolutely no we have no power here i i barely am able to beat
00:20:44.280
them into submission as the newspapers would say i would love to tell you about malcolm who beats his
00:20:51.680
children yeah what they don't say is just how much harder they beat you
00:20:55.980
we actually had a reporter over where we did that where we tried like a slap test to see if the kids would
00:21:01.600
flinch and like none of them flinch and then later that day one of the kids like jumped at me and i
00:21:05.740
like flinched yes you're like no please don't hurt me again please don't hurt me little god
00:21:13.980
yeah yeah that's like our new thing now like whenever a journalist comes over especially
00:21:19.560
there's a photographer we're like hey get some shots of us pretending to hit our kids and like
00:21:23.940
we keep doing this right next to their face and they just like stand there just well i mean i think
00:21:28.500
the thing is as we specifically raise our kids to like be really rough and tumble because that's
00:21:34.220
the way like i was raised so you know they are significantly more aggressive oh yeah i'm trying to get
00:21:40.900
them to like calm down and get ready for dinner and like titan and torsten are in a full like
00:21:46.260
wrestling fight on the kitchen floor having a blast like there's yeah it is yeah i thought it was just
00:21:52.360
gonna be the boys i thought maybe titan would be a little bit different no our kids love to fight
00:21:58.480
no yeah it is their favorite activity titan more than anyone else at this point you know everyone
00:22:04.080
like once i get everyone kind of eating and calm down titan cannot sit down she just wants to keep
00:22:08.580
like tickle fighting fight fighting running around it just never ends but no but what i mean is i
00:22:17.280
think that if we're talking about the landscape of the internet this makes a number of internet
00:22:21.500
platforms pretty unviable going forwards ones i think that this will end up killing is the 4chan
00:22:27.720
image boards because anyone posting there could be ai what's the point you don't know you just
00:22:33.020
don't know yeah reddit it makes reddit completely irrelevant because the reason i loved reddit is i
00:22:38.900
thought the stories were true i mean i knew that maybe like only 60 to 80 percent were true but like
00:22:44.740
enough where there was plausible deniability of like am i the asshole like you want the gossip to be
00:22:48.980
real if the gossip isn't real it's not fun and now i just can't trust it and then you're to the next
00:22:57.860
system which is okay which systems are going to survive yeah i think x will survive because the
00:23:02.980
majority of interaction you have with x is on accounts that you've been interacting with for a
00:23:06.380
long time well and also i don't need the accounts to be real and most most of the people that i think
00:23:11.000
are really interesting are also faceless like we may have met a lot of them in person so we know who
00:23:14.680
they really are but to the average person like they have no idea if it's an ai the thing is that
00:23:20.100
typically stuff on x is about sharing information that's backed up by receipts it's not first person
00:23:25.080
accounts and stories that are just like you have to take my word for it this is my you know what's
00:23:28.980
funny of the like right-wing based pseudonymous influencers like basically all of them are buff
00:23:35.760
white guys if you meet them in person and it's only the faced alt-right influencers like
00:23:41.340
us and like the jolly heretic who are like weird looking yeah it's like every every time
00:23:47.360
one of them gets their reveal these like white nationalists it's like oh this is like a buff white
00:23:52.320
guy yeah like the bronze age pervert it's like oh this is a buff white guy and then you have the
00:24:00.100
ones who are out there publicly before us you know jolly heretic and us it's like this is a nerd what is
00:24:05.040
this what are they what are they doing out there yeah oh by the way we have so you know like
00:24:10.700
for those who listen to this podcast a long time malcolm is often accused of being soy like i'm
00:24:15.540
just i'm just ugly he's soy and at first we're like well why because malcolm has like very masculine
00:24:21.840
characteristics strong jaw like i mean like physically if you look at me i have pretty strong
00:24:27.220
shoulders i have a very strong jaw i am like classically what you would think of as masculine our running
00:24:32.600
theory is that soy comes more from affectation and that like if you come across as bubbly and cheerful
00:24:39.860
and happy then you're called soy because for example the jolly heretic who doesn't i think
00:24:47.240
code as like classically masculine he doesn't look like chris williamson but is not referred to as soy
00:24:53.480
but he also comes across this kind of yeah like on his podcast i was called soy a lot by by his fans
00:24:59.060
no the last time i brought this up as something i was confused about the theory in the comments was
00:25:05.480
that it was because i wasn't extra muscular but i am more muscular than the jolly heretic and he is
00:25:12.460
not called soy and i am considered soy by his fan base so i don't think muscles is the answer
00:25:19.440
but what is different is the jolly heretic does not he doesn't he's not cheerful he's not bubbly he
00:25:24.740
doesn't smile yeah no i think i think that's it please comment below if that's what being soy is
00:25:31.660
what is soyness in their heads it's not no it's not just vitalism because bronze age pervert is very
00:25:38.360
vitalistic but in a very angry let's go out and kill people kind of way and you're in a very vitalistic
00:25:44.320
like i'm having the time of my life running through disneyland kind of way not necessarily happy
00:25:49.480
other people have talked about this in like comments on our channel they're like you know
00:25:54.860
before adopting your guys like pronatalist whatever mindset like i was really like depressed
00:25:59.680
and suicidal and now i'm like really happy with life and it's weird how when you adopt this like
00:26:05.480
future oriented like optimistic mindset all of a sudden you just don't notice all of the bad stuff
00:26:11.120
as much or you don't ruminate on it as much and you're just like you know what i'll figure this
00:26:15.200
out and i think that a lot of the other parts the things that that make me soy versus not soy is
00:26:21.620
it's people who focus on the the doomerism i mean to an extent somebody like bronze age pervert i don't
00:26:28.460
think he has kids for example i don't think he really has any sort of an invested interest in the
00:26:33.400
future i like his content but i can understand why he's pessimistic right like he's not in this
00:26:39.320
intergenerational game he may get optimistic if and when he has kids so yeah it's we we've seen
00:26:47.260
this with other influencers as well people often comment on how like a great one here is zero
00:26:51.480
punctuation who's a youtuber who i absolutely love and have loved for ages who does those really cynical
00:26:56.520
game reviews and since he became a dad they've become like significantly less cynical this is the
00:27:01.380
one with the british accent who talks really fast and yeah and he got fired from that company and
00:27:05.320
then started his own channel and now he's that's okay okay yeah i have listened to him forever his
00:27:10.060
voice is so iconic in my mind okay you know you know the dad that i most want to partner with and
00:27:15.620
i've almost thought about doing an episode just like profiling and deep diving on him is the guy who
00:27:20.340
created five nights at freddy's yeah he's got like six kids and he's been totally canceled for donating
00:27:25.880
to republicans dad goals yeah yeah yeah fans are trying to force you to think like them by saying that
00:27:33.220
republicans will kill them my favorite thing is when the the trans death stats thing came out and they're
00:27:38.540
like x many trans people died per year there was a famous thing during the biden administration they
00:27:43.380
came out and they said this and then if you ran the numbers by trans people in the u.s it meant that
00:27:47.420
trans people were dying at a lower rate or being murdered at a lower rate than non-trans people they
00:27:52.360
just had never run the statistics and they were so like such a sign of their but anyway so okay i hear all
00:28:00.620
that so what platforms are going to continue to exist x is going to continue to exist yes youtube
00:28:04.920
is going to continue to exist i actually watch a lot of ai human creators on youtube that i really
00:28:12.200
like there is a channel where i've been sending simone i'll put a little clip from them here and you
00:28:17.900
guys oh the ai music channel they do really cute ai music that i absolutely love
00:28:23.260
type does he like me if he stares click the link i'm already scared buzzfeed quiz says girl he's
00:28:35.560
obsessed i'll believe it no need to stress i scroll through reddit for the clues love advice from someone
00:28:46.560
named dino dude 92 hearts in my throat when you say hi so i ask the stars and wi-fi google says you're
00:28:58.080
into me based on how you said hey last week and and i don't think that's bad right like i can have
00:29:07.180
one of my favorite performing artists be an ai creator now well and it's it's funny because even
00:29:13.040
before people were capable of doing this people attempted to say they were doing it even when they
00:29:18.840
couldn't like with the gorillaz they were like yeah we're we're not really a band you know it's not
00:29:25.600
real and they they tried to frame that as a selling point same with hatsune miku right like she's not
00:29:31.280
real and that was a selling point so it's even weird to me that people are now proclaiming this bias
00:29:38.600
when in the past it was literally a competitive advantage people freak out about not safe for work sites
00:29:45.040
where you know you have ais chatting with people because the ais are like cleaning up against the
00:29:50.680
regular women oh that's great guys are like hey you're not like exploiting me in the way they're
00:29:55.660
exploiting me yeah like hey a world where all of those jobs are taken by ais and the former
00:30:02.580
types need to go find husbands is that not like a better world i feel like there's so many cases
00:30:09.860
now of that happening though with onlyfans women i think they want partners when they can get one in the end
00:30:16.640
they they often do and it doesn't always go well for them but yeah who knows what's gonna happen in
00:30:25.480
the end but yeah i mean this is a great example once again of how we are super not ready for ai that
00:30:33.680
like it has been demonstrated once again that ai is going to be incredibly disruptive and change many
00:30:38.300
things very fundamentally and yet people are acting like it's not a thing people keep asking us like
00:30:44.200
how did you do all these things how well we just like you ask grok this and then you ask perplexity this
00:30:49.820
and then you like it's done and then you do it yeah yeah it's not an army of competent workers
00:30:56.540
under us like i think these people who are like well i tried ai two years ago and it wasn't very good and
00:31:02.820
it's like well try it again like it's different now like things are different now and and there's a
00:31:10.600
portion of the population that is like very anti-vitalistic that will just react negatively to
00:31:16.120
anything that represents you know human flourishing or the the continued advancement of our civilization
00:31:21.540
they're like oh i want an older way and so they see our ai songs which we use for title cards
00:31:25.960
and they're like ree and i think that it's so sad like it doesn't
00:31:33.300
come off the way you think it comes off but subconsciously i think accepting it also requires
00:31:38.720
accepting a paradigm shift that i think a lot of people are not mentally ready for
00:31:45.280
they're they're really not ready i agree with that yeah i mean they're like oh this is trained on human
00:31:53.500
data so it's stealing from humans i'm like humans are trained on human data right you know what are you
00:31:58.180
talking about yeah but i mean it's very scary to think about how your job is going to become obsolete
00:32:04.580
and your kid's education is built around your job it's an entire like two-thirds of our civilization
00:32:10.680
that's about to become obsolete like yeah but remember how it was with covid people until it
00:32:16.240
actually happened and had been happening for at least two weeks people vehemently gaslit themselves
00:32:23.820
you remember i told you early in covid where i like sat simone i was like everything's gonna be shut
00:32:29.240
down like i i was like we ended up making a huge bet shutting down our company like way before anyone else
00:32:36.500
was doing this because we have a travel company and i wanted to save as many employees jobs as
00:32:41.160
possible and so not paying for the in-person locations or anything like that was the best way
00:32:45.780
to do that and so i was like look i'm i'm pulling the cord and she goes but nobody says they're all
00:32:51.340
saying this will be over in a month and i'm like that's not how viruses work like i don't know why
00:32:57.420
people are saying that but that is counter to reality i'm looking at the same thing with ai people are
00:33:03.120
saying oh well people will find a way around this no they they won't we right now are in an era of
00:33:10.520
cars and people are the horses or the mid people are at least the based interesting people will get
00:33:17.740
through this that's for sure but what about the rest of you you know yeah
00:33:21.940
is there anything you're going to do differently based on this or is this just another
00:33:28.460
pebble in the jar it really helps you understand what platforms are worth investing in and which
00:33:34.520
ones are not worth okay so you're long twitter and you're long youtube you're short or medium long
00:33:41.840
twitter i think this is going to affect twitter a ton sorry x let's not deadname yeah right
00:33:47.940
but youtube i think will be affected very little by this i think even when we get totally like
00:33:54.120
automated people talking on youtube they'll be within specific niches that's already happening
00:34:00.320
because people are already putting up those google generated podcasts and i love them they're great
00:34:05.480
i've watched some of them yeah like i'll be like oh i see what you did but also yeah this is a good
00:34:10.300
summary so i'm going to keep listening to it you're not easily going to get something like a replacement for
00:34:15.120
us in in your daily life so well we are a bit orthogonal and i think that's the thing yeah there
00:34:20.200
there are youtubers who we watch because they're orthogonal i think that great examples of that
00:34:25.540
are strange aeons for example she makes the weirdest most obscure stuff she she makes edible savory cakes
00:34:34.400
for her weird cats and you know ai can't do that and so yes i think it's about orthogonality if you
00:34:43.460
want to be a social media influencer again mid is dead you can't be mid you have to be lumpy
00:34:49.640
and weird and make savory swedish cakes for your hairless cat i had a stanford classmate who started a
00:34:56.960
podcast recently and i listened to it and i was like she sounds exactly like ai i know that's that's
00:35:02.440
the problem yeah i was like like you're able to get like high level people on your podcast and that's
00:35:07.800
cool right but like you sound like the ai podcast like maybe try to like mix up your accent or something
00:35:15.800
like i think she's trying to look professional and polished and again that's why i'm kind of also
00:35:21.420
long on us being weird looking because with filters and with cosmetic procedures and everything and of
00:35:28.600
course ai the norm the the mid the the forgettable is going to be perfect and symmetrical and and classically
00:35:36.380
beautiful and this stuff that will be memorable will be weird and we're not weird looking we're
00:35:41.520
spicy looking we're runway we're runway your brother and sister-in-law are our catalog and we
00:35:47.220
are runway i just keep telling myself that because of how runway looks some people say that you aren't
00:35:53.120
attractive and i'm like anyone by the way who thinks that you're not attractive and i'm not talking
00:35:57.440
about like personal preferences or anything you will not breed and i and i i mean this very seriously
00:36:02.580
because if you go to like an airport or something like this and you're looking at like the generic
00:36:06.780
human population this is even like the wealthier part you're easily by any objective standard for
00:36:12.100
your age in the top one percent you are wearing husband goggles and the audience is going to admit
00:36:16.940
that but also no go to an airport go go go look at average humans not the humans you see online
00:36:23.980
well i'll just say any woman would wish she had the pain tolerance that i do there are other things
00:36:29.180
that you can't see that are more than skin deep that are very advantageous to i know i love your
00:36:34.760
pain tolerance it's great the point i'm making is and i think that this is actually really toxic and
00:36:39.720
going to lead to a lot of people sort of dropping out of the gene pool is they are cuing what they
00:36:45.140
think average attractiveness is to average attractiveness they see among the people that
00:36:50.120
they see online oh as was revealed with people asserting that margot robbie is mid when the barbie
00:36:56.740
movie came out and yeah like they think that they're increasing their status yeah
00:37:02.800
whereas they're basically saying i may have a penis and balls but i will never reproduce i might as
00:37:07.300
well be a eunuch i might as well be a eunuch and that i think is really and then i'd point out here
00:37:12.680
i'm saying this in the context of you being a 37 year old woman who's pregnant with kid number five
00:37:18.340
right now like i've seen i've seen some stuff this is this is someone who's acquired a freaking
00:37:23.480
company in peru that put so many years on my life yeah i cried every night i cried so hard
00:37:33.140
into that freaking pillow that it would just be soaked and i just have to sleep on the the mattress
00:37:37.100
without the pillow it's so stressful i'm so glad that's over life just gets so much better so much
00:37:44.160
stressful stuff in our lives seriously at this point now when stuff comes up we're like okay sure
00:37:49.520
whatever it's good it's good it's good this is why they had boys go on krypteia like after that
00:37:58.460
they're like i don't know like whatever sure fine do your thing well if you want to check out our
00:38:05.260
school system please do check it out parasia.io or you can just search the collins institute and it'll link
00:38:10.460
to it through the video it is yeah i've decided to start advertising on the show because it's like
00:38:15.040
actually good enough to advertise now like check it out genuinely good yeah genuinely good it's what
00:38:20.500
makes what i love about it is the socratic tutor system that you can either verbally chat with or
00:38:25.700
text chat with it doesn't just tell you it doesn't just teach you it forces you to guess what
00:38:34.680
the true answer is and then explains to you what you got right what you might have missed and why things
00:38:40.520
are the way they are so it really forces you to understand the underpinnings of something
00:38:45.500
and the the deep influences behind it for example i was going through a piece one of the notes on
00:38:52.400
chinese architecture and it's like hey what are some philosophical influences that might have you know
00:38:59.000
well first like what are some characteristics of chinese architecture and i guessed some things and
00:39:02.900
they're like yeah and then there's also something more can you guess it and like here's why and then you
00:39:07.160
know okay what are the philosophical influences that might have been underpinning this and i'm like
00:39:11.000
oh confucianism and they're like yeah but there's something else can you guess what it is and i'm like
00:39:14.740
ah and it just like it really forces you to engage yeah well then i i will
00:39:21.380
not forget any of these things because i spend so much time trying to think first like what am i
00:39:25.700
missing and and a lot of research has shown that when you like pre-test someone before they even go
00:39:31.260
into learning the subject and then they get it wrong their mind has been uh hypersensitized to
00:39:37.740
look for that information whereas if they're just passively receiving it there's no like net waiting
00:39:43.800
to catch it you know they're not holding that net up to catch that information because they haven't
00:39:47.940
been primed to do so so it's just really good in the way that it teaches i find it quite addicting it's
00:39:54.020
it's very fun i don't think there's any other educational ai system that comes close right now
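the guess-first, explain-after loop described above can be sketched like this — a hypothetical toy, not the actual collins institute tutor, and the question, answers, and explanation are all made up:

```python
# toy sketch of a guess-first socratic round -- NOT the actual collins
# institute tutor; question, answers, and explanation are all made up

def socratic_round(question, answers, explain, get_guess):
    """ask for a guess before revealing, then report hits and misses"""
    guess = get_guess(question)
    hits = [a for a in answers if a.lower() in guess.lower()]
    missed = [a for a in answers if a not in hits]
    # the learner sees what they missed only after committing to a guess,
    # which is the "pretesting" priming described in the episode
    return {"got_right": hits, "missed": missed, "explanation": explain}

# hypothetical usage
result = socratic_round(
    question="what are some philosophical influences on chinese architecture?",
    answers=["confucianism", "taoism"],
    explain="confucian hierarchy shaped axial layouts; taoist ideas shaped gardens.",
    get_guess=lambda q: "confucianism, maybe?",
)
print(result["missed"])  # -> ['taoism']
```

the design choice that matters is the ordering: the explanation is only surfaced after the guess is committed, which is what the pretesting research says primes retention.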
00:39:59.760
like i've tried the other ones yeah i know i mean i've tried like learning games like duolingo
00:40:03.540
things like that and i'm like oh you have a streak going oh this is dramatically better than duolingo
00:40:07.540
oh yeah oh yeah we may not have the cute owl but we do have a better system yeah but i'm excited
00:40:14.060
that we put this together and i'm excited that people get to use it who care about lifelong learning
00:40:18.160
or their kids learning in a different environment and we're going to build stuffed animals for your
00:40:22.660
kids it'll bring them back to educational topics as well that's the wislet.ai system that we're
00:40:27.360
working on next and ai video games that we're building and one of the firms draper was
00:40:33.400
looking to invest in us and they're like you guys are just split across too many projects
00:40:37.000
and i'm like no it's other people are leaving too many very obvious areas where people should
00:40:42.340
be developing stuff open and we have to handle it yeah that and like have you heard of ai like
00:40:47.260
future businesses are going to be started by the smallest teams just using a bunch of ai
00:40:53.040
you don't get it like the vcs who are like i want more focus it's like you don't get the age of ai
00:40:57.520
yeah if anything cross-disciplinary experience and we've experienced this from working on different
00:41:03.360
projects like our pronatalist advocacy bleeds into what we're doing with the school bleeds into what
00:41:09.340
malcolm and bruno are doing with reality fabricator everything you learn from all these experiences
00:41:15.600
and get inspiration it's it's to use a sadly destroyed word it is highly synergistic and it
00:41:23.360
creates a flywheel that i absolutely love so oh synergistic simone i know that sounds pretty lame
00:41:31.440
the problem is that synergy is such a cool concept and like the corporate world had to just destroy it
00:41:36.360
i'm very angry about that because like you know a good relationship like what makes a good relationship
00:41:41.980
special is synergy is that you get more than you put in and more than you could get from the individual parts
00:41:47.340
and yet you literally if you just like walk into a room and you're like synergy people are like
00:41:53.200
uh-oh i'm leaving now like you might as well have farted at least then you know you might have
00:41:59.020
gotten like a laugh it's how people know you're a corpo yeah or if you if you say i have to socialize
00:42:05.260
this what i love i love that cyberpunk normalized the term corpo you know i'm like they're corpo stooges
00:42:13.500
like come on man it's a good it's a good term yeah my god versus the ai ronin like the the individualist
00:42:22.000
teams that are doing everything come on yeah you gotta have fun with it anyway i love you to death simone
00:42:27.580
i am very excited for my dinner tonight please make sure it's amazing don't forget when you are mixing
00:42:33.360
the sauce to do some chili oh yeah less vinegar more chili oil like equal parts chili oil and
00:42:40.780
sesame oil yeah do you want me to put the sesame seeds directly inside yes put those
00:42:45.880
sesame seeds directly inside okay extra soy sauce okay oh actually very light on the soy sauce
00:42:51.940
use dark soy sauce okay dark soy sauce very light on the vinegar chili sauce sesame seeds and
00:43:00.020
and sesame oil i'm on it plus your butter on top of the steak presumably right oh absolutely yeah the
00:43:07.740
basil yeah basically you're eating like the steak is half and then the sauces are the other half
00:43:14.760
and if you see any curry mixes pre-made that you can mix in to cook
00:43:23.580
the dish that you're reheating oh the chicken do we have any no we don't mix up one of the
00:43:32.960
we can just write it off you don't like it you don't you don't have to eat it no it's really good
00:43:39.040
base ingredients it's just completely unflavored how about i saute it with chili oil chili flake sauce
00:43:49.500
and some fish sauce oyster sauce yeah like what if i put in various sauces i would do oyster sauce a bit
00:44:02.920
of the fermented chili paste uh whatever it's called the korean one oh gochujang sauce gochujang
00:44:10.500
oyster sauce and then chili oil and i think that would taste really good all right let me write that down
00:44:17.440
um severe thunderstorm watch okay hold on yeah i keep getting warnings about that
00:44:22.480
gochujang sauce oyster sauce fermented wait you don't want the chili flakes right what was the third
00:44:31.400
one chili no chili oil and then you know you know those chili flakes i have in like the big jars
00:44:43.020
love you try for science i love you too we invented a new dish
00:44:49.500
did you by the way actually remember to thaw a steak or do we not have the steak okay so we're
00:44:57.580
gonna do steak and pesto steak thai peanut basil pesto that you made yes that is scano and parmesan cheese
00:45:05.260
it's gonna hopefully it'll be good i'm actually really excited for this dish i think it'll be
00:45:10.060
pretty good yeah i'd love to have it with like and of course your sauce that is like a dumpling
00:45:15.020
dipping sauce if you're okay with that you made extra of that right and the pesto no i didn't make it i
00:45:20.060
only make enough for each night but i can i can always whip it up oh yeah yeah do that so
00:45:26.540
you know i like to split it up between things i don't know that the pesto is going to be the most oh
00:45:30.660
yeah right well oh god i hope it's good because i made a lot of it i can't believe you bought that
00:45:36.100
much basil i was like it's like a dollar's worth or something or like you know it's it was a lot
00:45:42.500
because the thai basil you know you have to buy it all at once and oh my god i'm still so excited to
00:45:46.480
eat the rest of the leftovers that are thai basil they were so good can we do that tomorrow more of
00:45:52.060
the thai basil the yeah yeah i i i froze a lot of it so yes good you you may