Amjad Masad: The Cults of Silicon Valley, Woke AI, and Tech Billionaires Turning to Trump
Episode Stats
Length
1 hour and 55 minutes
Words per Minute
178
Summary
In this episode of the show, I sit down with Amjad Masad, the founder and CEO of Replit, a platform that helps anyone learn to code and build software. We talk about his upbringing in a displaced Palestinian family in Jordan, his path to Silicon Valley, and the original vision of computing as an extension of human intellect rather than a replacement for it. We also get into why so many people are afraid of AI and why he doesn't see it as an existential threat, the rationalist and effective-altruist movements organizing around that fear, his theory of who Satoshi Nakamoto really is, and what Gödel, Penrose, and the limits of physics suggest about whether the mind is a computer. Subscribe at TuckerCarlson.com, where we promise to bring you the most honest content and the most honest interviews we can, without fear or favor.
Transcript
00:00:00.000
welcome to tucker carlson show it's become pretty clear that the mainstream media are dying
00:00:11.600
they can't die quickly enough and there's a reason they're dying because they lie they lied so much
00:00:17.340
it killed them we're not doing that tucker carlson.com we promise to bring you the most
00:00:21.480
honest content the most honest interviews we can without fear or favor here's the latest
00:00:27.560
it does sound like you're like directly connected to ai development yes you're part of the ecosystem
00:00:33.260
yes and we benefited a lot from when it started happening like it was almost a surprise to a lot
00:00:39.620
of people but we saw it coming and you saw ai coming saw it coming yeah so you know this recent ai wave
00:00:47.840
you know it surprised a lot of people when chat gpt came out november 22 yeah a lot of people just lost
00:00:54.260
their mind like suddenly a computer can talk to me and that was like the holy grail i wasn't into
00:00:59.160
it at all really you know what paul graham one of my uh closest friends and sort of allies and mentors
00:01:06.480
he's a big silicon valley figure uh he's a writer kind of like you uh as uh you know he writes a lot
00:01:13.020
of essays and uh and he hates it he thinks it's like a midwit right and it's just like making people
00:01:19.700
write worse making people think worse worse or not think at all right not think as the iphone has
00:01:25.100
done as wikipedia and google have done yes we were just talking about that the um you know the
00:01:30.260
iphones ipads whatever they made it so that anyone can use a computer but they also made it so that
00:01:36.980
no one has to learn to program the original vision of computing was that this is uh this this is
00:01:45.000
something that's going to give us superpowers right j.c.r. licklider the head of darpa while the
00:01:49.220
internet was developing wrote this uh essay called man-computer symbiosis and he talks about
00:01:55.740
how computers can be an extension of of of ourselves it can can help us grow we could become
00:02:02.660
you know there's this marriage between the type of intellect that the computers can do which is
00:02:09.960
high speed arithmetic whatever and the type of intellect that humans can do is more intuition
00:02:14.760
yes um but you know since then i think the the uh sort of consensus has sort of changed around
00:02:22.640
computing which is and i'm sure we'll get into that which is why people are afraid of ai is kind
00:02:27.400
of replacing us this idea of like computers and computing are threat because uh they're directly
00:02:35.100
competitive with with humans which is not really the the belief i hold they're extensions of us and i
00:02:40.180
think people learning to program and this is really embedded at the heart of our mission at
00:02:44.400
replit is what gives you superpowers whereas when you're just tapping you're kind of a consumer
00:02:51.540
you're not a producer of software and i want more people to be producers of software uh there's a book
00:02:58.400
by doug uh douglas rushkoff it's called program or be programmed
00:03:04.220
and the idea if you're not the one coding someone is coding you someone is programming you these
00:03:10.400
algorithms and you know social media they're programming us right so too late for me to learn
00:03:17.640
to code though i don't think so i don't think so i can't balance my checkbook assuming there are still
00:03:23.700
checkbooks i don't think there are but let me just go back to something you said a minute ago um
00:03:27.680
that the idea was originally as conceived by the darpa guys who made this all possible
00:03:32.840
that machines would do the math humans would do the intuition i wonder as machines become more
00:03:40.240
embedded in every moment of our lives if intuition isn't dying or people are less willing to trust
00:03:47.760
theirs i've seen that a lot in the last few years where something very obvious will happen
00:03:51.520
and people are like well i could sort of acknowledge and obey what my eyes tell me and my instincts are
00:03:58.920
screaming at me but you know the data tell me something different right like my advantage is i'm
00:04:05.440
like very close to the animal kingdom that's right and i just believe in smell yeah and so but i i wonder
00:04:10.960
if that's not a result of the advance of technology well i don't think it's inherent to the advance of
00:04:17.820
technology i think it's an it's it's a cultural thing right it's how to again this vision
00:04:22.440
of computing as a replacement for humans versus an extension machine for for humans and so you know
00:04:31.140
you go back you know bertrand russell uh wrote a book about history of philosophy and history of
00:04:37.140
mathematics and like you know going back to the ancients and uh pythagoras and and all these things
00:04:42.300
and you could tell in the writing he he was almost surprised by how much intuition played into
00:04:48.360
science and math and you know uh in the sort of ancient era of uh advancements and logic and
00:04:55.960
philosophy and and all of that whereas i think the culture today is like well you got to check your
00:05:02.860
intuition at the door yes yeah you're biased your intuition is racist or something and you have to
00:05:09.340
this is bad um and you have to be this like you know blank slate and like you trust the data but by the
00:05:16.260
way data is you can make the data say a lot of different things i've noticed wait can i just
00:05:20.460
ask a totally off-topic question that just occurred to me how are you this well i mean so you grew up
00:05:26.300
in jordan speaking arabic in a displaced palestinian family you didn't come to the u.s until pretty
00:05:32.320
recently you're not a native english speaker how are you reading bertrand russell yeah and how like
00:05:37.240
what was your education is everyone in is every palestinian family in jordan this well educated like
00:05:43.820
what kind of like yeah palestinian uh diaspora is like pretty well educated and you're starting to
00:05:50.440
see this generation our generation of kind of who grew up are starting to um sort of become more
00:05:57.120
prominent i mean in silicon valley you know a lot of c-suite and vp level executives a lot of them
00:06:03.560
are palestinian originally a lot of them wouldn't say so because there's still you know bias and
00:06:07.700
discrimination and all that but i wouldn't say they're palestinian they wouldn't say and you know
00:06:11.680
they're called adam and some of them some of the christian palestinians especially kind of blend
00:06:15.460
in right but there's a lot of them over there but how did you so how do you wind up reading i assume
00:06:22.040
you read bertrand russell in english yes how did you learn that you didn't grow up in an english-speaking
00:06:27.400
country yeah well jordan is kind of an english-speaking well it kind of is that's true right so so you know
00:06:33.340
it was it was a british colony i think one of the you know i you know the the independence they
00:06:38.280
like happened like 50s or something like that or maybe 60s so it was like pretty late in the you
00:06:44.200
know british uh sort of empire's history that jordan stopped being a colony so there was like a lot of
00:06:50.620
british influence i went to so my father my father is a government engineer he didn't he didn't have a
00:06:57.360
lot of money so we lived a very modest life kind of like middle lower middle class uh but he really
00:07:03.880
cared about education he sent us to private schools and in those private schools we uh uh we learned
00:07:10.040
kind of using british diploma right so igcse levels you know that's are you familiar with not at all
00:07:16.800
yeah so so uh you're part of the sort of british uh you know colonialism or whatever is like you know
00:07:23.480
education system became international uh i think it's a good thing uh yeah there are british schools
00:07:28.960
everywhere yeah yeah british schools everywhere and there's a good education system it gives students a
00:07:33.300
good level of freedom and autonomy to kind of pick the kind of things they're interested in
00:07:36.600
so i you know went to a lot of math and physics but also did did like random things i did child
00:07:42.080
development which i still remember and now that i have kids i actually use and in high school you do
00:07:48.360
that in high school and i uh what does that have to do with the civil rights movement what do you mean well
00:07:55.360
that's the only topic in american schools really yeah oh yeah you spend 16 years learning about the
00:08:00.860
civil rights movement so everyone can identify the edmund pettus bridge but no one knows anything
00:08:04.100
else oh god i'm so nervous about that with my kids no opt out trust me um that's so interesting so
00:08:12.100
when you when did you come to the u.s 2012 damn and now you've got a billion dollar company that's
00:08:18.600
pretty good yeah i mean america is amazing like i i just love this country uh it's given us a lot
00:08:24.020
of opportunities i just love the people like everyday people i like just talk to people i was just
00:08:28.540
talking to to my driver which she was like um you know i'm so embarrassed i didn't i didn't know who
00:08:33.840
tucker carlson was good that's why i live here yeah i was like well good for you i think that means you're
00:08:40.700
just like you know you're just living your life and she's like yeah i'm you know i have my kids and
00:08:44.520
my chickens and my whatever i was like that's great that's it means you're happy you're happy yes
00:08:49.140
but so i'm sorry to digress i'm sitting here you're referring to
00:08:54.000
all these books i'm like you're not even from here it's incredible uh so but back to ai um
00:09:00.040
and and to this question of intuition you don't think that it's in it's inherent so in other words
00:09:06.660
if my life is to some extent governed by technology by my phone by my computer by you know all the
00:09:12.860
technology embedded in like every electronic object you don't think that makes me trust machines more
00:09:18.540
than my own gut um you can choose to and i think a lot of people are being guided to to do that um
00:09:27.600
but ultimately you're giving away a lot of uh freedom uh and and you know there's a there's a
00:09:36.220
it's not just me saying that there's like a huge tradition of hackers and computer scientists that
00:09:43.440
kind of started uh ringing the alarm bell like really long time ago about like the way things
00:09:50.180
were trending which is you know more centralization less you know diversity of competition in the market
00:09:56.680
yes uh and you have like one global social network as opposed to many now it's actually getting a little
00:10:02.840
better but um and you had a lot of these people you know start you know the crypto movement i know you
00:10:09.980
you were at the bitcoin conference recently and you told them uh cia started bitcoin they got really
00:10:14.640
angry and on twitter i i don't know that but until you can tell me who satoshi was i i have some
00:10:20.340
questions what i actually have a feeling about who satoshi was but that's a separate conversation
00:10:26.660
no it's not let's just stop right now because i can't i'll never forget to ask you again who is
00:10:30.140
satoshi there's a guy his name is paul le roux by the way for those watching you don't know who satoshi
00:10:35.920
satoshi is the the the pseudonym that we use for the person who created bitcoin but we don't know
00:10:42.800
it's amazing you know it's this is this thing that was created we don't know who created it he didn't
00:10:47.440
he never moved the money i don't think maybe there was some activity here and there but like there's
00:10:51.920
like billions hundreds of billions of dollars locked in so we don't know the person is they're not
00:10:56.640
cashing out and it's like pretty crazy story right amazing so paul le roux yeah paul le roux uh
00:11:03.480
was uh you know a crypto hacker in rhodesia um before it became zimbabwe and he created something called
00:11:16.780
encryption for the masses e4m and was one of the early by the way i think snowden used e4m as part
00:11:22.840
of his uh as part of his hack um so he was one of the people that really you know made it so that
00:11:28.940
uh cryptography is accessible to more people however he did he did become a criminal he became
00:11:33.540
a criminal mastermind in manila he was really controlling the the city almost you know he paid
00:11:39.680
off all the cops and everything he was making so much money for so from so much criminal activity
00:11:45.340
his nickname was letoshi with an l and so there's like a lot of you know circumstantial evidence there's
00:11:52.360
no like clear-cut evidence but i just have a feeling that he generated so much cash he didn't know
00:11:58.120
what to do with it where to store it and on the side he was building bitcoin to be able to store
00:12:02.860
all that all that cash and around that same time that satoshi disappeared he went to jail he got
00:12:08.680
he got booked for um for all the crime he did he recently got sentenced to 20 years 25 years of prison
00:12:16.400
i think judge asked him like what would you do if you if you would go out and he's like i would build
00:12:20.540
an asic chip to mine bitcoin um and so look you know this is a strong opinion loosely held but it's
00:12:27.620
just like there's so he is currently in prison he's currently in prison yeah in in this country or the
00:12:33.100
philippines i think this country because he was doing all the crime here he was selling drugs online
00:12:38.000
essentially huh we should go see him in jail yeah yeah to check out his story is fascinating i'm i'm sorry
00:12:44.540
i just had to get that out of you so um i keep digressing uh so you see ai and you know you're part
00:12:52.640
of the ai ecosystem of course but um you don't see it as a threat no no no i don't see it as a threat
00:12:59.700
at all and i think and i you know i i heard some of your you know podcasts with joe rogan whatever
00:13:04.740
and you're like we should nuke the data centers and and i'm excitable yeah on the basis of very
00:13:10.220
little information well actually yeah well actually tell me what is your theory about the threat of ai
00:13:14.120
you know i always i want to be the kind of man who admits up front his limitations and his ignorance
00:13:19.980
um and on this topic i'm legitimately ignorant but i have read a lot about it and i've read
00:13:24.680
most of the alarmist stuff about it and the idea is as you well know that the machines become so
00:13:32.480
powerful that they achieve a kind of autonomy and they though designed to serve you wind up ruling you
00:13:39.560
yeah and um you know i'm i'm really interested in uh ted kaczynski's writings his two books that he
00:13:49.880
wrote obviously i hasten to say ritually i'm totally opposed to letter bombs or violence of any
00:13:54.580
kind um but ted kaczynski had a lot of provocative and thoughtful things to say about technology
00:14:01.100
it's almost like having live-in help which you know people make a lot of money they all want to
00:14:05.380
have live-in help but the truth about live-in help is you know they're there to serve you but you wind
00:14:09.700
up serving them it inverts and ai is a kind of species of that uh that's the fear and i don't want
00:14:17.340
to live i don't want to be a slave to a machine any more than i already am so it's kind of that simple
00:14:22.300
and then there's all this other stuff you know a lot more about this than i do and you're in that
00:14:25.720
world but yeah that's my concern that's actually a quite valid concern i would like decouple the
00:14:30.660
existential threat concern from the concern and we've been talking about this of like machine like
00:14:36.940
us being slaves to the machines and i i think ted kaczynski's um critique of technology is actually
00:14:45.000
one of the best yes thank you yeah i um i wish he hadn't killed people of course because i'm against
00:14:51.140
killing but i also think it had the opposite of the intended effect he did it in order to bring
00:14:56.320
attention to his uh to his thesis and ended up obscuring it uh but i i really wish that every
00:15:04.220
person in america would read but not just the his manifesto but the book that he wrote from prison
00:15:09.160
because they're just so at the least they're thought-provoking and really important yeah yeah i mean
00:15:14.460
briefly and we'll get to existential risk in a second but uh he talked about this thing called
00:15:18.660
the power process which is he thinks that it's intrinsic to human happiness to to struggle for
00:15:26.120
survival to go through life as a child as an adult build up yourself get married have kids and then
00:15:34.800
become the elder and then die right exactly and he he thinks that modern technology kind of disrupts
00:15:40.600
this process and it makes people miserable how do you know that i read it i i'm very curious i like
00:15:46.760
i read a lot of things and i just don't have mental censorship in a way like i can i'm really
00:15:55.500
curious i'll read anything do you think being from another country has helped you in that way
00:15:59.400
yeah and and i also i think just my childhood um i was like always different i i was when i had hair
00:16:07.160
it was all red it was bright red um and my my whole family is kind of uh or at least half of my family
00:16:13.920
are redheads and um and you know because of that experience i was like okay i'm different i'm comfortable
00:16:22.480
being different i'll be i'll be different and uh and you know that just commitment to not worrying
00:16:29.200
about anything you know about conforming or like it was forced on me that i'm not conforming just by
00:16:35.900
virtue of yes different and and uh and being curious and being you know um good with computers and all that
00:16:44.860
um i think that carried me through life it's just i like i get you know i get like almost a disgust
00:16:53.560
reaction to conformism and like mob mentality oh i couldn't agree i had a similar experience
00:16:59.120
childhood i totally agree with you yeah we've traveled to an awful lot of countries on this show
00:17:03.900
to some free countries the dwindling number and a lot of not very free countries places famous for
00:17:09.720
government censorship and wherever we go we use a virtual private network of vpn and we use express
00:17:16.280
vpn we do it to access the free and open internet but the interesting thing is when we come back here
00:17:23.900
to the united states we still use express vpn why big tech surveillance it's everywhere it's not just
00:17:31.160
north korea that monitors every move its citizens make no that same thing happens right here in the
00:17:37.340
united states and in canada and great britain and around the world internet providers can see
00:17:42.580
every website you visit did you know that they may even be required to keep your browsing history
00:17:48.140
on file for years and then turn over to federal authorities if asked in the united states internet
00:17:54.240
providers are legally allowed to and regularly do sell your browsing history everywhere you go online
00:18:00.200
there is no privacy did you know that well we did and that's why we use express vpn and because we
00:18:06.660
do our internet provider never knows where we're going on the internet they never hear it in the
00:18:11.420
first place that's because a hundred percent of our online activity is routed through express vpn's
00:18:16.900
secure encrypted servers they hide our ip address so data brokers cannot track us and sell our online
00:18:24.160
activity on the black market we have privacy express vpn lets you connect to servers in 105 different
00:18:31.600
countries so basically you can go online like you're anywhere in the world no one can see you
00:18:37.500
this was the promise of the internet in the first place privacy and freedom those didn't seem like
00:18:43.540
they were achievable but now they are express vpn we cannot recommend it enough it's also really easy
00:18:50.380
to use whether or not you fully understand the technology behind it you can use it on your phone
00:18:54.640
laptop tablet even your smart tvs you press one button just tap it and you're protected you have
00:19:01.780
privacy so if you want online privacy and the freedom it bestows get it you can go to our special
00:19:09.520
link right here to get three extra months free of express vpn that's expressvpn.com slash tucker express
00:19:18.240
e-x-p-r-e-s-s vpn.com slash tucker for three extra months free tucker says it best their credit card
00:19:29.200
companies are ripping americans off and enough is enough this is senator roger marshall of kansas
00:19:35.180
our legislation the credit card competition act would help in the grip visa and mastercard have on us
00:19:42.320
every time you use your credit card they charge you a hidden fee called a swipe fee
00:19:47.260
and they've been raising it without even telling you this hurts consumers and every small business
00:19:53.080
owner in fact american families are paying eleven hundred dollars in hidden swipe fees each year
00:19:59.140
the fees visa and mastercard charge americans are the highest in the world double canada's and eight
00:20:06.040
times more than europe's that's why i've taken action but i need your help to help get this passed
00:20:11.820
i'm asking you to call your senator today and demand they pass the credit card competition act
00:20:18.640
paid for by the merchants payments coalition not authorized by any candidate or candidates committee
00:20:26.100
hillsdale college offers many great free online courses including a recent one on marxism socialism and
00:20:43.460
communism today marxism goes by different names to make itself seem less dangerous names like critical
00:20:49.580
race theory gender theory and decolonization no matter the names this online course shows it's the
00:20:55.880
same marxism that works to destroy private property and that will lead to famines show trials and gulags
00:21:02.200
start learning online for free at tucker4hillsdale.com that's tucker4hillsdale.com
00:21:13.020
it's the heart of america it's the heart of america the heart of america by sycamore on apple spotify and
00:21:21.400
devices near you baby now this is the time to leave our defenses behind
00:21:30.660
so kaczynski's thesis that struggle is not only inherent to the condition but an essential part
00:21:56.860
yes of your evolution as a man or as a person um and the technology disrupts that
00:22:03.080
i mean that seems right to me yeah and i actually struggle to sort of um dispute that uh despite
00:22:11.880
being a technologist right ultimately uh again like i said it's like one of the best critique i think we
00:22:17.180
can spend the whole podcast kind of really trying to tease it apart i think ultimately where i kind of
00:22:24.440
defer and again it just goes back to a lot of what we're talking about my views and technology as an
00:22:29.660
extension of us it's like we just don't want technology to be uh a thing that's just merely
00:22:35.820
replacing us we want it to be an empowering thing and what we do at replit is we empower people to learn
00:22:43.180
to code to build startups to build companies to become entrepreneurs and um and i think you can
00:22:49.840
uh in this world you have to create the power process you have to struggle and uh yes you can
00:22:58.840
you know this is why i'm also you know a lot of technologists talk about ubi and universal basic
00:23:04.760
oh i know i think it's all wrong because it just goes against human nature thank you so i think
00:23:10.880
you want to kill everybody put them on the dole yes yes so you know i don't think technology is
00:23:16.960
inherently at odds with the power process i'll leave it at that and we can go to to to existential
00:23:24.860
threat yeah of course sorry boy am i just digressive i can't believe i interview people for a living
00:23:31.420
we had dinner last night that was awesome it was one of the best dinners yeah but we we hit about
00:23:38.320
400 different threats yes that's amazing so so that's what's what's out there i know um i'm sort
00:23:46.600
of convinced of it my or it makes sense uh to me and i'm kind of threat oriented anyway so people with
00:23:53.020
my kind of personality are like sort of always looking for you know the big bad thing that's coming
00:23:57.980
the asteroid or the nuclear war the ai slavery um but i know some pretty smart people who very smart
00:24:06.020
people who are much closer to the heart of ai development who also have these concerns um and
00:24:12.740
i think a lot of the public shares these concerns yeah and the last thing i'll say before soliciting
00:24:18.700
your view of it much better informed view of it is that there's been surprisingly and tellingly little
00:24:25.460
conversation about the upside of ai so instead it's like this is happening and if we don't do it china
00:24:32.180
will that may i think that's probably true but like why should i be psyched about it like what's
00:24:38.100
the upside for me right do you know what i mean normally when some new technology or or huge change
00:24:44.060
comes the people who are profiting from like you know what's going to be great it's going to be great
00:24:48.500
you're not going to ever have to do x again you know you just throw your clothes in a machine and
00:24:52.460
press a button and they'll be clean yes i'm not hearing any of that about it that's a very
00:24:56.900
shrewd observation and i'll tell you exactly why um and to tell you why it's like a little bit of a
00:25:02.900
long story because i think there is a organized effort to scare people about ai organized organized
00:25:10.660
yes and so this this starts with a mailing list in the 90s is a transhumanist mailing list uh called uh
00:25:19.940
the extropians and these extropians i might have got it wrong extropia or something like that but
00:25:27.140
they um they believe in the singularity so the singularity is a moment of time where uh you know
00:25:35.860
ai is progressing so fast or technology in general progressing so fast that you can't predict what
00:25:40.100
happens it's self is self-evolving and it just all bets are off you know we're entering a new world
00:25:47.540
where you just can't predict it where technology can't be controlled technology can't be controlled
00:25:51.620
it's going to remake remake everything and those people believe that's a good thing because the
00:25:57.380
world now sucks so much and we are you know we are imperfect and unethical and all sorts of
00:26:03.380
irrational whatever and so they really wanted to um for the singularity to happen and there's this
00:26:09.860
young guy on this uh list his name is eliezer yudkowsky and he claims he can write this ai and he would
00:26:16.820
write like really long essays about how to build this ai uh suspiciously he never really publishes
00:26:23.540
code and it's all just prose about how he's going to be able to build ai anyways he he's able to to
00:26:30.500
fundraise they started this thing called the singularity institute a lot of people were excited
00:26:35.220
about the future kind of invested in him peter thiel most famously and he spent a few years trying to
00:26:40.820
build an ai again never published code never published any like real progress uh and and then
00:26:49.060
came out of it saying that not only you can't build ai but if you build it it will kill everyone so he
00:26:54.820
kind of switched from being this optimist you know singularity is great to like actually ai will for
00:27:00.980
sure kill everyone and uh and then he was like okay the reason i made this mistake is because i was
00:27:08.100
irrational and the way to get people to understand that ai is going to kill everyone is to make them
00:27:13.700
rational so he started this blog called less wrong and less wrong is it like walks you through steps to
00:27:19.860
becoming more rational look at your biases examine yourself you know sit down meditate on the all the
00:27:25.540
irrational decisions you've made and try to correct them um and then they start this thing called
00:27:31.140
center for applied rationality or something like that cfar and they're giving seminars about
00:27:37.220
rationality but the intention seminar about rationality what's that like uh i've never been to one but my
00:27:44.580
guess would be if they will talk about the biases whatever but they have also like weird things where
00:27:50.500
they have this almost struggle session like thing called debugging a lot of people wrote blog posts
00:27:55.380
about how that was demeaning and it caused psychosis in some people 2017 that community there was like
00:28:02.580
collective psychosis a lot of people were kind of going crazy and it's all written about it on the
00:28:07.460
internet debugging so that would be like kind of your classic cult technique yeah where you have to strip
00:28:13.060
yourself bare like auditing and scientology or yes it's very common yes yeah so yeah it's a constant
00:28:20.100
in cults yes uh is that what you're describing yeah i mean that's what i read on these accounts yeah
00:28:25.780
they will sit down and they will like audit your mind and tell you where you're wrong and and all of
00:28:30.180
that um and and it's it caused people a huge distress on young guys all the time like talk about
00:28:37.380
how going into that community has caused them huge distress and there were like offshoots of this
00:28:42.100
community where there were suicides there were murders there were a lot of really dark and deep
00:28:47.860
shit and the other thing is like they kind of teach you about rationality they recruit you to ai risk
00:28:53.220
because if you're rational you know you're a group we're all rational now we learned the art of
00:28:58.260
rationality and we agree that ai is going to kill everyone therefore everyone outside of this group is
00:29:05.220
wrong and we have to protect them ai is going to kill everyone and but but also they believe other
00:29:11.460
things like they believe that you know uh polyamory is is rational and and everyone that polyamory yeah
00:29:18.420
like like uh you can have sex with multiple partners essentially that but they think that's i mean
00:29:24.020
i think it's um it's certainly a natural desire if you're a man to sleep with more and different
00:29:30.100
women for sure but it's rational in the sense how like you've never met a happy polyamorous long term
00:29:39.140
i've known a lot of them not a single one so how it might be self-serving you think to recruit more
00:29:46.420
impressionable people into yeah and their hot girlfriends yes right um so that's rational yeah
00:29:54.980
supposedly um and so they you know they convince each other of all these uh cult-like behaviors
00:30:00.260
and the crazy thing is like this group ends up being super influential uh because um you know they
00:30:09.220
recruit a lot of people that are interested in ai and the ai labs and the people who are starting these
00:30:16.020
companies were reading all this stuff so elon uh you know famously read a lot of nick bostrom as
00:30:23.540
as kind of an adjacent figure to the rationality community he was part of the original mailing list
00:30:29.060
i think he would call himself a you know rational part of the rational community but he wrote a book
00:30:33.380
about ai and how ai is going to you know kill everyone essentially i think he moderated his views more
00:30:38.340
recently but originally he was one of the people that are kind of banging the alarm um and you know the
00:30:44.980
foundation of open ai uh was based on on a lot of these fears like elon had fears of ai killing
00:30:52.740
everyone uh he was afraid that google was going to do that and so they you know group of people i don't
00:30:58.660
think everyone at open ai really believed that but you know some of the original founding story was that
00:31:04.180
and they were recruiting from that community it was so much so when um you know sam altman got fired
00:31:10.340
recently he was fired by someone from that community um someone who started with effective
00:31:18.500
altruism which is another offshoot from that community really and so the ai labs are intermarried
00:31:26.180
in a lot of ways with this with this community and so it it ends up they kind of you know borrowed a
00:31:32.660
lot of their talking points but by the way a lot of these companies are great companies now and i think
00:31:36.420
they're cleaning up house but there is i mean it's i'll just use the term it sounds like a cult
00:31:41.940
to me yeah i mean it has the hallmarks of it in your description yeah and can we just push a little
00:31:47.780
deeper on what they believe you say they are transhumanists yes what is that well i think i think
00:31:53.380
they're just unsatisfied with with human nature unsatisfied with the current ways um we're constructed
00:32:00.980
um and that you know we're irrational we're unethical uh and so they they start uh they long for
00:32:10.340
the world where we can become more rational more ethical by transforming ourselves either uh by merging
00:32:18.500
with ai via chips or or what have you changing our bodies um and like fixing fundamental issues that
00:32:27.060
they perceive with humans via modifications and merging with machines it's just so interesting
00:32:32.260
because um and so shallow and silly um like a lot of those people i have known are not that smart
00:32:39.540
actually because the best things i mean reason is important and we should in my view given us by god
00:32:45.620
and it's really important and being irrational is bad on the other hand the best things about people
00:32:52.580
their best impulses are not rational mm-hmm i believe so too there is no rational justification
00:32:59.620
for giving something you need to another person yes for spending an inordinate amount of time helping
00:33:04.580
someone for loving someone those are all irrational now banging someone's hot girlfriend i guess that's
00:33:09.540
rational but that's kind of the lowest impulse that we have actually we'll wait to hear about uh
00:33:14.740
effective altruism so they they think our natural impulses that you just talked about are indeed
00:33:20.500
irrational and there's a guy uh his name is peter singer uh philosopher from australia the infanticide
00:33:26.980
guy yes he's so ethical he's for killing children yeah i mean so their philosophy is utilitarian
00:33:32.980
utilitarianism is that you can calculate ethics yeah and you can start apply it and you get into really
00:33:38.260
real weird territory like you know if um you know there's all these problems all these thought experiments
00:33:44.260
like you know you you have you know two people at the hospital uh requiring some organs of another
00:33:51.700
third person that came in for a regular check-up you uh or they will die you're ethically um uh
00:34:00.020
you you're supposed to kill that guy get his organ and and put it into into the other two and so
00:34:06.500
it gets i don't think people believe that per se i mean but but they um but but there's so many uh
00:34:14.980
problems with that there's another belief that they have can i say that belief or that conclusion
00:34:21.380
grows out of the core belief which is that you're god it's like a normal person realizes
00:34:27.460
sure it would help more people if i killed that person and gave his organs to you know a number of
00:34:32.340
people like that's just a math question yeah true but i'm not allowed to do that because i didn't
00:34:38.020
create life i don't have the power i'm not allowed to make decisions like that yes because i'm just a
00:34:43.860
silly human being who can't see the future and is not omnipotent because i'm not god yeah i feel like
00:34:48.180
all of these conclusions stem from the misconception that people are gods yes i i does that sound right no i
00:34:55.460
agree i mean a lot of the um yeah i think it's you know they're at root they're just fundamentally
00:35:06.180
unsatisfied with humans and maybe perhaps hate hate yeah humans well they're deeply disappointed yes
00:35:13.940
i think that's such a i've never heard anyone say that um as well that they're disappointed with human
00:35:19.780
nature they're disappointed with human condition they're disappointed with people's flaws and i feel
00:35:23.940
like that's the i mean on one level of course i mean you know we should be better and but that
00:35:29.540
we used to call that judgment which we're not allowed to do by the way um that's just super
00:35:34.820
judgy actually what they're saying is you know you suck and it's just a short hop from there to you
00:35:42.100
should be killed i think i mean that's a total lack of love whereas a normal person a loving person says
00:35:48.900
you kind of suck i kind of suck too yes but i love you anyway and you love me anyway and i'm
00:35:54.660
grateful for your love right that's right that's right well they'll say you suck join our rationality
00:35:59.380
community have sex with us uh um so but can i just clarify or these these aren't just like
00:36:07.700
you know support staff at these companies like are there so you know you've heard about sbf and ftx
00:36:12.500
yeah they had what's called a polycule yeah right they were all having sex with each other uh just
00:36:18.740
given now i just want to be super catty and shallow but given some of the people they were having sex
00:36:23.220
with that was not rational no actual person would do that come on now yeah that's true yeah um well so
00:36:32.260
you know uh yeah yeah it's um what's even what's even more disturbing there's you know another ethical
00:36:40.260
component to their philosophy called long-termism and this this comes from the effective altruist
00:36:48.100
sort of branch of of rationality long-termism long-termism and so what they think is in the
00:36:55.700
future if we did the if we made the right steps there's going to be a trillion humans trillion minds
00:37:01.460
they might not be humans it might be ai but they're going to be trillion minds who can experience utility
00:37:06.020
can experience good things fun things whatever if you're utilitarian you have to put a lot of weight
00:37:11.940
on it and maybe you discount that sort of like discounted cash flows uh but you still you know
00:37:18.980
have to posit that you know you know if if there are trillions perhaps many more people in the future
00:37:26.900
you need to value that very highly even if you discounted a lot it ends up being valued very highly
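To make the weighting he is describing concrete, here is a toy back-of-the-envelope version (the numbers are mine and purely illustrative, not the guest's):

```python
# Toy illustration of the long-termist weighting described above: even if each
# hypothetical future mind is valued at a penny on the dollar relative to a
# person alive today, a trillion of them still outweigh the present population.
future_minds = 1_000_000_000_000       # the "trillion minds" figure from the conversation
discount_per_mind = 0.01               # heavily discount each future mind
present_population = 8_000_000_000     # roughly everyone alive today

weighted_future = future_minds * discount_per_mind
print(weighted_future)                        # 10000000000.0 (ten billion)
print(weighted_future > present_population)   # True: the future dominates the calculation
```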
00:37:32.260
so a lot of these communities end up all focusing on ai safety because they think that ai because
00:37:38.100
they're rational they arrived and we can talk about their arguments in a second they arrived at the
00:37:43.140
conclusion that ai is going to kill everyone therefore effective altruists and rational community
00:37:48.180
all these branches they're all kind of focused on ai safety because uh that's the most important
00:37:54.180
thing because we want a trillion people in the future to be great but you know when you're assigning
00:38:00.180
sort of value uh that high it's sort of a form of pascal's wager it is um sort of uh you can justify
00:38:09.940
anything including terrorism including uh doing really bad things if you're really convinced that ai is
00:38:18.980
going to kill everyone um and the future holds so much value uh more value than any living human today
00:38:29.380
has value you might justify really doing anything and so built into that it's a dangerous framework
00:38:36.980
but it's the same framework of every genocidal movement yes from you know at least the french
00:38:44.100
revolution to present yes of a glorious future justifies a bloody present yes and look i'm not
00:38:51.220
accusing them of genocidal intent by the way i don't know them but i but those ideas lead very quickly
00:38:56.740
to the camps i feel kind of weird just talking about people just generally i'd like to talk
00:39:00.260
about ideas about things but if they were just like a you know silly berkeley cult or whatever
00:39:06.180
and they didn't have any real impact in the world i wouldn't care about them but what's happening is
00:39:12.980
that they were able to convince a lot of billionaires of these ideas i think elon maybe changed his mind
00:39:19.540
but at some point he was convinced of these ideas uh i don't know if he gave them money i think
00:39:23.460
there was a story at some point that was to draw that he was thinking about it but um a lot of other
00:39:28.340
billionaires gave them money and now they're organized and they're in dc lobbying for ai like
00:39:33.300
regulation they're uh they're behind the ai regulation in california uh and actually profiting from it
00:39:41.380
uh there was a story in uh pirate wires where this you know the main sponsor uh dan hendrycks behind
00:39:49.140
sb 1047 um started a company at the same time that certifies the safety of ai and as part of the
00:39:57.060
bill it says that you have to get certified by a third party so there's there's aspects of it that
00:40:02.420
are um kind of let's let's profit from it by the way this is all allegedly based on this article i i don't
00:40:08.500
know for sure uh i think senator scott uh wiener was trying to do the right thing with the bill but he
00:40:14.980
was listening to a lot of these cult members let's call them and um and and they're very well
00:40:23.940
organized and uh also a lot of them still have connections to the big ai labs and uh some of the
00:40:30.420
work there and they would want to create you know a situation where there's no competition in ai
00:40:37.220
regulatory capture per se and so i'm not saying that these are like the direct motivations all of them
00:40:42.900
are true believers but you know you might you you might kind of infiltrate this group and kind of
00:40:49.220
direct it in a way that benefits these corporations yeah well i'm from dc so i've seen a lot of
00:40:55.860
instances where you know my bank account aligns with my beliefs thank heaven yeah this kind of happens
00:41:01.780
winds up that way it's funny um climate is the perfect example there's never one climate solution
00:41:08.340
that makes the person who proposes it poorer or less power exactly ever not one we've told you
00:41:13.540
before about hallow it is a great app that i am proud to say i use my whole family uses it's for
00:41:20.340
daily prayer and christian meditation and it's transformative as we head into the start of
00:41:25.460
school in the height of election season you need it trust me we all do things are going to get
00:41:30.740
crazier and crazier and crazier sometimes it's hard to imagine even what is coming next so with
00:41:36.740
everything happening in the world right now it is essential to ground yourself this is not some
00:41:43.780
quack cure this is the oldest and most reliable cure in history it's prayer ground yourself in prayer
00:41:51.060
and scripture every single day that is a prerequisite for staying sane and healthy and maybe for doing
00:41:57.220
better eternally so if you're busy on the road headed to kids sports there is always time to pray and
00:42:02.820
reflect alone or as a family but it's hard to be organized about it building a foundation of prayer
00:42:08.820
is going to be absolutely critical as we head into november praying that god's will is done in this
00:42:13.620
country and that peace and healing come to us here in the united states and around the world christianity
00:42:19.380
obviously is attack under attack everywhere that's not an accident why is christianity the most peaceful
00:42:26.260
of all religions under attack globally did you see the opening the paris olympics there's a reason
00:42:31.860
because the battle is not temporal it's taking place in the unseen world it's a spiritual battle
00:42:37.860
obviously so try hallow get three months completely free at hallow that's hallow.com tucker if there's
00:42:46.180
ever a time to get spiritually in tune and ground yourself in prayer it's now halo will help
00:42:51.620
personally and strongly and totally sincerely recommend it hallow.com tucker
00:43:11.620
i wonder like about the core assumption which i've had up until right now
00:43:17.220
that these machines are capable of thinking yeah is that true so let's go through their chain of
00:43:27.060
reasoning i think the fact that it's uh it's a stupid cult-like thing or perhaps actually a cult
00:43:34.740
does not automatically mean that their arguments are that's right that's exactly right uh i think it
00:43:40.340
does you you do have to kind of discount some of the arguments because it comes from crazy people but
00:43:46.500
the argument the chain of reasoning is that um humans are general intelligence we have these
00:43:53.540
things called brains brains are computers they're based on purely physical phenomena that we know they're
00:44:00.980
computing and if you agree that humans are computing and therefore we can build a general intelligence in
00:44:11.220
the machine and if you agree up till this point if you if you're able to build a general intelligence
00:44:17.700
in the machine even if only at human level then you can create a billion copies of it and then it becomes
00:44:25.300
a lot more powerful than any one of us and because it's a lot more powerful than any one of us it would
00:44:31.620
want to control us or it would want it would not care about us because it's such it's more powerful kind
00:44:38.020
of like we don't care about ants we'll step on ants no problem right because these machines are so
00:44:43.540
powerful they're not gonna care about us and i i sort of get off the train at the first chain of of
00:44:49.860
reasoning but every one of those steps i have problems with um the first step is um the mind
00:44:57.380
is a computer and you know based on what and the idea is oh well if you don't believe that the mind
00:45:05.300
is a computer then you believe in some kind of spiritual thing well you know uh well you have to
00:45:12.740
convince me you you haven't presented an argument but but but but the idea that like speaking of
00:45:18.180
rational but yeah this is what reason looks like right um the the idea that we have a complete
00:45:25.140
description of the universe anyways is wrong right or we don't have a universal physics we have physics
00:45:32.420
of the small things we have physics of the big things we can't really cohere them or combine them
00:45:37.300
so just the idea that you being a materialist is sort of incoherent because we don't have a complete
00:45:42.260
description of the world that's one thing that's a slight argument i'm not gonna no no no it's
00:45:46.020
it's a very interesting argument so so you're saying as someone who i mean you're a scientist
00:45:50.980
you're effectively a scientist um you just state for viewers who don't follow this stuff
00:45:57.220
like the limits of our knowledge of physics yeah so you know we have essentially two conflicting
00:46:02.260
theories of physics these systems can't be kind of married they're not a universal system you can't
00:46:08.420
use them both at the same time well that suggests uh a profound limit to our understanding yes what's
00:46:17.140
happening around us in the natural world does it yes it does and i think uh this is again another error
00:46:22.660
of the rationalist types is that just assume that you know we were so much more advanced in our science
00:46:28.340
than we actually are so it sounds like they don't know that much about science yes
00:46:32.900
okay thank you thank you i'm sorry to ask you to pause yeah that's not even the main crux of my
00:46:38.500
argument uh there is um a uh philosopher slash mathematician slash scientist wonderful his name is uh
00:46:46.900
sir uh roger penrose i love how the british kind of give the sir title someone has accomplished um
00:46:54.180
the uh he wrote this book called the emperor's new mind in the in the uh and it's it's based on you
00:47:02.740
you know the emperor's new clothes the idea the idea that you know the emperor's kind of naked and
00:47:06.740
and uh in his opinion the argument that the mind is a computer is a sort of consensus argument that
00:47:14.740
is wrong the emperor's naked it's not really an argument it's an assertion yes it's an assertion
00:47:19.140
that is fundamentally wrong and the way he proves it is very interesting um there is in mathematics there's
00:47:26.420
uh something called gödel's incompleteness theorem and you know what that says is um there are
00:47:36.020
statements that are true that can't be proved in mathematics so he constructs gödel constructs a
00:47:43.700
like a number uh system where he can start to make statements about this number system so the you know
00:47:51.300
he he creates a statement that's like this statement is unprovable in system f where the
00:47:56.340
where the system is f the whole system is f well if you try to prove it then that then that statement
00:48:03.300
becomes false uh but you know it's true because it's unprovable in the system and roger penrose says
00:48:11.540
you know because we have this knowledge that it is true by looking at it despite like we can't prove
00:48:17.460
it i mean the whole feature of the sentence is that it is unprovable um therefore our knowledge
00:48:25.780
is outside of any formal system therefore yes the human brain is or like our mind is understanding
00:48:32.340
something uh that mathematics is not able to describe uh and i thought
00:48:41.620
i the first time i read it again i read a lot of these things like what's the famous you were telling
00:48:46.980
me last night i'd never heard it the bertrand russell self-canceling assertion yeah uh it's like
00:48:52.580
this statement is false it's called a lie it's called a liar paradox what explain why that's just
00:48:59.140
that's gonna float in my head forever why is that a paradox so this statement is false if you uh if you
00:49:05.300
look at a statement and agree with it then it becomes true but if it's true then it's not true it's false
00:49:10.900
and you go through the circular thing and you never stop right it broke logic in a way yes
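For reference, the two self-referential sentences being discussed can be written out in standard logical notation (a minimal sketch, my formalization rather than anything from the conversation):

```latex
% The liar sentence: it asserts its own falsehood, so neither truth value is stable.
L \;\equiv\; \neg\,\mathrm{True}(L)

% Gödel's sentence for a formal system F: it asserts its own unprovability in F.
% If F is consistent, G is true, yet F cannot prove it -- the point Penrose builds on.
G \;\equiv\; \neg\,\mathrm{Prov}_F\!\left(\ulcorner G \urcorner\right)
```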
00:49:18.260
and bertrand russell spent his whole you know big part of his life writing this book uh principia mathematica and
00:49:26.500
he wanted to really prove that mathematics is complete consistent just you know uh decidable
00:49:33.380
computable all of that um and then all these things happen gödel's incompleteness theorem uh turing the
00:49:40.340
inventor of the computer actually this is the most ironic piece of science history that nobody ever
00:49:46.100
talks about but uh turing invented the computer to show its limitation so he invented the turing machine
00:49:53.140
which is the ideal representation of a computer that we have today all computers are turing machines
00:49:59.380
and he showed that um uh this machine uh if you give it a set of instructions it can't tell whether
00:50:07.940
those set of instructions will ever stop will run and stop or it will complete to a stop or will
00:50:14.260
continue running forever it's called the halting problem and this makes this proves that mathematics
00:50:19.700
has undecidability it's not fully decidable or computable
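As a rough sketch of the self-reference at the heart of Turing's argument (illustrative only; the names are made up, and the whole point is that no real halts function can exist):

```python
# Hypothetical oracle: True if calling f() would eventually finish, False if it
# would run forever. Turing showed no such function can actually be implemented;
# the stub body below exists only so this file runs.
def halts(f) -> bool:
    return True  # placeholder answer, not a real decision procedure

def troublemaker():
    # Do the opposite of whatever the oracle predicts about this very function.
    if halts(troublemaker):
        while True:   # oracle says "it halts", so loop forever
            pass
    return            # oracle says "it loops forever", so halt immediately

# Whichever answer halts(troublemaker) gives is wrong, so a correct halts()
# cannot exist. That impossibility is the "halting problem" mentioned above.
print(halts(troublemaker))
```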
00:50:26.180
so all of these things were happening as he was writing the book and you know it was really depressing for him because he kind of went
00:50:33.380
out to prove that you know mathematics is complete and all of that um and uh you know this caused kind
00:50:41.700
of a major panic uh at the time between mathematicians and all of that it's like oh my god like our
00:50:47.540
systems are not complete so it sounds like the deeper you go into science and the more honest you
00:50:54.180
are about what you discover the more questions you have yeah which kind of gets you back to where you
00:50:59.700
should be in the first place which is in a posture of humility yes and yet i see science used certainly
00:51:06.340
in the political sphere i mean those are all dumb people so it's like who cares actually kamala harris
00:51:10.260
lecturing me about science i don't even hear it but so also some smart people like believe the science
00:51:15.540
the assumption behind that demand is that it's complete and it's knowable and we know it and if
00:51:22.020
you're ignoring it then you're ignorant willfully or otherwise right well i my view of science it's a
00:51:27.460
method ultimately it's a method anyone can apply it's it's democratic it's decentralized anyone can
00:51:32.580
apply the scientific method including people who are not trained but in order to practice the method
00:51:37.140
you have to come from a position of humility yes that i don't know that's right i'm using this method
00:51:41.700
to find out and i cannot lie about what i observe right that's right and and today you know it's you
00:51:47.300
know the capital s science is used to control and it's used to um uh propagandize and lie
00:51:54.900
of course a bit you know in the hands of you know just really people who shouldn't have power
00:52:00.420
just dumb people with you know pretty ugly agendas but but we're talking about the world that you live
00:52:05.860
in which is like unusually smart people who do this stuff for a living and are really trying to advance
00:52:11.060
the ball in science and i think what you're saying is that some of them knowingly or not just don't
00:52:18.580
appreciate how little they know yeah and you know they go through this chain of reasoning for this
00:52:23.220
argument and you know none of those are at minimum uh you know a complete uh and like you know they
00:52:33.380
don't just take it for granted if you even doubt that the mind is a computer you're you know i'm sure
00:52:39.300
a lot of people will call me heretic and will call me like you know all sorts of names because it's just
00:52:44.740
dogma that the mind is a computer that the mind is a computer is dogma in technology science
00:52:50.900
that's so silly um yes well i mean let me count the ways the mind is different from a computer
00:52:58.260
first of all you're not assured of a faithful representation of the past memories change
00:53:02.980
over time right in a way that's misleading and who knows why but that is a fact right that's not true
00:53:09.220
of computers that's right i don't think yeah um but how are we explaining things like intuition
00:53:15.860
yeah yeah in instinct those are not well that is actually my question could those ever be features
00:53:21.700
of a machine you could argue that uh neural networks are sort of intuition machines and that's what a lot
00:53:29.060
of people say uh but neural networks you know and maybe i will describe them um just for the audience
00:53:36.740
uh neural networks are inspired by the brain um and the idea is that you can connect a network of small
00:53:46.820
little functions just mathematical functions and you can train it by giving examples you could give
00:53:53.140
it a picture of a cat and if it's yes you know let's say that this network has to say yes if it's a cat
00:53:59.780
no if it's not a cat so you give it a picture of a cat and if its answer is no then it's wrong
00:54:06.420
you adjust the weights based on the difference between its guess and the right answer and you do this
00:54:12.500
i don't know a billion times and then the network encodes features about the cat uh and this is
00:54:21.380
literally exactly how neural networks work is is you tune all these small parameters until there's some
00:54:28.900
embedded um feature detection of you know especially in classifiers right um and this is not intuition
00:54:39.060
this is uh basically uh automatic programming the way i see it right of course so we we can write code
00:54:46.340
manually uh you can go to our website write code uh but we can generate algorithms um automatically
00:54:57.620
via machine learning machine learning essentially discovers these algorithms and sometimes it
00:55:02.740
discovers like very crappy algorithms uh for example like you know you know all the pictures that we
00:55:09.140
gave it of a cat had grass in them so it would learn that grass equals cat the color green equals cat
00:55:18.500
yes and then you give it one day a picture of a cat without grass and it fails or like what happened
00:55:24.740
all turns out it learned the wrong thing so uh because it's obscure what it's actually learning
00:55:32.340
people interpret that as as intuition uh because it's not uh the algorithms are not uh as explicated
00:55:40.420
and there's a lot of work now on trying to explicate these algorithms which is great work for
00:55:44.900
companies like anthropic um but you know i don't think you can call it intuition just because it's obscure
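A minimal, illustrative sketch of the training loop described above: a tiny classifier over two made-up features, where every training photo of a cat also happens to contain grass, so part of what it learns is really about the grass. The features, data, and numbers are hypothetical, just to show the adjust-the-weights idea:

```python
import math
import random

# Each example is ((has_whiskers, has_grass), is_cat).
# In this toy training set every cat photo also has grass in it.
train = [((1, 1), 1), ((1, 1), 1), ((0, 0), 0), ((0, 0), 0)]

w = [0.0, 0.0]   # one small weight per feature
b = 0.0
lr = 0.1

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))   # probability the picture is a cat

# "Adjust the weights based on the difference between the guess and the
# right answer, and do this many, many times."
for _ in range(10_000):
    x, y = random.choice(train)
    err = predict(x) - y
    w[0] -= lr * err * x[0]
    w[1] -= lr * err * x[1]
    b    -= lr * err

print(predict((1, 1)))  # cat on grass: confidently "cat"
print(predict((1, 0)))  # cat with no grass: the score drops toward 0.5,
                        # because half the evidence it learned was the grass
```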
00:55:54.740
so what is it how is intuition different human intuition um
00:56:03.300
we don't you know for one we don't require a trillion examples of cat to learn a cat
00:56:08.980
good good point um you you know a kid can learn uh language with very little examples
00:56:18.180
right now when we're training these large language models like chat gpt you have to give it the entire
00:56:22.660
internet for it to learn language and that's not really how humans work and the way we learn is
00:56:28.500
like we uh combine intuition and some more explicit way of of learning um and i don't think we've
00:56:37.460
figured out how to do it with machines just yet it do you think that structurally it's possible for
00:56:44.660
machines to get there so so so you know this this chain of reasoning um again i can go through every
00:56:57.060
point and present present arguments to the contrary or at least like present doubt but no one is really
00:57:01.620
kind of trying to deal with those doubts um and uh and uh my view is that i'm not holding these doubts
00:57:13.940
you know uh very very strongly but my view is that we just don't have a complete understanding of the
00:57:19.780
mind and you can't you at least can't use it uh to argue that a kind of machine that acts like a human but
00:57:28.500
much more powerful it can kill us all but you know do i think that machine uh you know ai can get
00:57:35.540
really powerful yes i think ai can get really powerful can get really useful i think functionally
00:57:41.540
can feel like it's general ai is ultimately a function of data the kind of data that we put into
00:57:47.860
it it's the functionality is based on this data so we can get very little functionality outside of that
00:57:54.340
actually we don't get any functionality outside of that data it's actually been proven that these
00:57:58.820
machines are just the function of their data the sum total of what you put in exactly garbage in
00:58:04.100
garbage out yeah uh the cool thing about them is they can mix and match different functionalities
00:58:10.260
that they learn from the data so it looks a little bit more general but let's say we collected all data
00:58:15.540
of the world we collected everything that we care about and we somehow fit it into a machine and now
00:58:20.420
everyone's building these really large data centers you will get a very highly capable machine that will
00:58:29.140
kind of look general because we collected a lot of economically useful data and we'll start doing
00:58:37.540
economically useful tasks and from our perspective it will start to look general so i'll call it
00:58:43.540
functionally agi i don't doubt we're sort of headed in some direction like that but but we haven't figured
00:58:51.140
out how these machines can actually generalize and can learn and can use things like intuition for when
00:58:57.700
they see something fundamentally new outside of their data distribution they can actually react to it
00:59:04.180
correctly and learn it efficiently we don't have the science for that so um because we don't have the
00:59:09.860
understanding of it yes on the most fundamental level you began that explanation by saying we
00:59:14.180
don't really understand the human brain so like how can we compare it to something because we don't
00:59:17.220
even really know what it is and there are a couple uh there's a science there's a machine learning
00:59:21.220
scientist françois chollet i don't know how to pronounce uh french names but i think that's his name
00:59:26.900
he um he he created a sort of an iq-like test you know where you're rotating shapes and whatever
00:59:32.260
and uh an entrepreneur put a million dollars uh for anyone who's able to solve it using ai
00:59:40.660
and all the modern ais that we think are super powerful couldn't do something that like a 10 year
00:59:46.420
old kid could do and it showed that again those machines are just functions of the data the moment
00:59:53.300
you throw a problem that's novel at them they really are not able to do it now again i used i'm
00:59:59.540
not fundamentally discounting the fact that maybe we'll get there but just the reality of where we
01:00:04.980
are today you can't argue that we're just going to put more compute and more data into this and
01:00:10.180
suddenly it becomes god and kills us all because because that's the argument and they're you know
01:00:14.820
going to dc and they're going to all these places that are springing up regulation this regulation is
01:00:19.700
going to hurt american industry it's going to hurt startups it's going to make it hard to
01:00:23.700
compete it's going to give china a tremendous uh um advantage uh and it's going to really hurt us
01:00:30.500
based on these flawed arguments that they're not actually battling with these real questions it
01:00:36.500
sounds like they're not and what gives me pause is not um so much the technology it's the way that the
01:00:41.540
people creating the technology understand people so i think the wise and correct way to understand people
01:00:48.020
is as not self-created beings people did not create themselves people cannot create life as
01:00:54.900
beings created by some higher power who at their core have some kind of impossible to describe spark
01:01:02.420
a holy mystery and for that reason they cannot be enslaved or killed by other human beings that's wrong
01:01:09.860
there is right and wrong that is wrong i mean lots of gray areas that's not a gray area
01:01:13.300
because they're not self-created yes right um i think that all humane action flows from that belief
01:01:22.020
and that the most inhumane actions in history flow from the opposite belief which is people are just
01:01:27.380
objects that can and should be improved and i have full power over them like that's a real that's
01:01:33.140
a totalitarian mindset and it's the one thing that connects every genocidal movement is that belief so
01:01:38.340
i it seems to me as an outsider that the people creating this technology have that belief yeah
01:01:43.140
and you don't even have to be spiritual to have that belief look i um you certainly you certainly
01:01:49.140
don't yeah yeah so i think i think that's actually a rational conclusion based on 100 agree i'll give
01:01:54.180
you one interesting anecdote again from science we've had brains for half a billion if you believe in
01:01:59.460
evolution all that we have had brains for half a billion uh years right um and we've had kind of a human
01:02:07.140
uh like species uh for you know you know half a million year perhaps more perhaps a million years
01:02:18.340
there's a moment in time 40 000 years ago it's called the great leap forward where we see culture we see
01:02:27.540
religion we see drawings we see we saw like very little of that before that tools and whatever and
01:02:34.420
suddenly we're seeing this this cambrian explosion of culture right and pointing to something larger
01:02:41.940
than just like daily needs or the world around them but yeah and it's not we're not we're still not able
01:02:47.540
to to to explain it you know david reich wrote this book it's called i think who we are where we came
01:02:53.300
from in it he talks about uh trying to look for that genetic mutation that happened that potentially
01:03:00.340
created this this explosion and they they have some idea of what it could be and some candidates but
01:03:06.180
they don't really have it right now but you have to ask the question like what happened 30 or 40 000
01:03:11.940
years ago right where it's clear i mean it's indisputable that the people who lived during that
01:03:18.420
period were suddenly grappling with metaphysics yes they're worshiping things there's a clear
01:03:23.940
clear separation between between again the animal brain and the human brain uh and it's clearly not
01:03:32.180
computation like we didn't suddenly like grow a computer in our brain it's something else happened
01:03:38.500
but what's so interesting is like the instinct of modern man is to look for something inside the
01:03:43.140
person that caused that whereas i think the very natural and more correct instinct is to look for
01:03:47.780
something outside of man that caused that i'm open to both yeah i mean i i don't know the answer
01:03:52.740
i mean of course i do know the answer but um but i'll just pretend i don't but at very least both
01:03:59.700
are possible so if like you confine yourself to looking for a genetic mutation or change
01:04:06.340
then you know you're sort of closing things off that's not an empiricist a scientific way of looking at
01:04:11.140
things actually you don't foreclose any possibility right yeah science you can't right interesting yeah
01:04:17.380
yeah i know that's uh that's very interesting uh so you know um i i think that uh these machines
01:04:23.700
i'm betting my business on ai getting better and better and better and it's gonna uh make us all
01:04:31.940
uh better it's gonna make us all more educated okay so okay yeah now's the time for you to tell
01:04:38.260
me why i should be excited about something i've been hearing yeah so um uh uh this technology large
01:04:48.020
language models where we kind of fed uh a neural network the entire internet and it has capabilities
01:04:57.460
mostly around writing around uh information lookup around summarization around coding uh it does a lot of
01:05:06.100
really useful thing and you can program it to kind of pick and match between these different skills
01:05:11.140
you can program these skills using code um and so the kind of products and services that you can build
01:05:17.460
with this um are amazing so one one of the things i'm most excited about this application of the technology
01:05:25.300
um there's this problem called the bloom's two sigma problem uh there's this uh you know uh scientist
01:05:32.900
that was studying education and um he was looking at different interventions to try to get you know
01:05:40.980
kids to learn better or faster or have just better educational outcomes and he found something kind of
01:05:48.260
bad which is there's only one thing you could do to move kids not in a marginal way but in a two
01:05:58.180
standard deviations from the norm like in a big way like better than 98 percent of the other kids um by doing
01:06:07.620
one-on-one tutoring using a type of learning called mastery learning, one-on-one tutoring is the key formula there
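A quick worked check of that two sigma figure: assuming roughly normally distributed outcomes, a student two standard deviations above the mean scores better than about 98 percent of students.

```python
from math import erf, sqrt

# Fraction of a normal distribution that lies below a point
# two standard deviations above the mean.
percentile = 0.5 * (1 + erf(2 / sqrt(2)))
print(round(percentile * 100, 1))   # about 97.7, i.e. better than ~98% of students
```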
01:06:14.820
that's great i mean we discovered the solution to education we can up-level everyone all
01:06:22.500
humans on earth the problem is like we don't have enough teachers to do one-on-one tutoring it's
01:06:28.820
very expensive you know no country in the world can afford that so now we have these machines that can
01:06:35.860
talk that can teach that can um they can present information uh that you can interact with it in a very
01:06:43.940
human way you can talk to it it can talk to you back right and we can build um ai applications to
01:06:51.860
teach people one-on-one and you can have it serve seven billion people and everyone can get smarter
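A minimal sketch of what such a one-on-one tutoring loop could look like in code, assuming the OpenAI Python client; the model name and the prompt here are placeholders for illustration, not anything Replit actually ships:

```python
from openai import OpenAI   # assumes the `openai` package is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A mastery-learning flavored instruction: teach one idea at a time and
# check understanding before moving on.
messages = [{
    "role": "system",
    "content": ("You are a patient one-on-one tutor. Explain one concept at a "
                "time, then ask a short question to check understanding before "
                "moving on."),
}]

while True:
    student = input("student> ")
    if not student:
        break
    messages.append({"role": "user", "content": student})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",          # placeholder model name
        messages=messages,
    )
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print("tutor>", answer)
```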
01:06:59.940
i'm totally for that i mean that was the promise of
01:07:05.460
the internet and it didn't happen so i hope this does um i was gonna save this for last but i can't control myself
01:07:13.140
so i just know being from dc that when uh the people in charge see new technology the first thing
01:07:19.780
they think of is like how can i use this to kill people um so what are the military applications
01:07:25.620
potentially of this technology you know that's one of the other thing that i i i'm sort of very
01:07:30.180
skeptical of this lobbying effort to get government to um to regulate it because like i think the biggest
01:07:36.820
offender in abusing this technology would probably be government you think you know i watched your
01:07:42.580
interview with uh jeffrey sachs um who's like a columbia professor very very mainstream um and uh i think
01:07:52.580
he got assigned to like a lancet um uh sort of study of covid origins or whatever and he he arrived at
01:07:59.860
very at the time heterodox view that it was created in a lab and was created by u.s government um and
01:08:07.780
and and and so you know the the government is supposed to protect us from these things and
01:08:12.980
now they're talking about pandemic readiness and whatever well let's talk about let's talk about
01:08:18.500
how do we watch what the government is doing how do we actually have democratic processes to ensure
01:08:25.540
that you're you're not the one abusing these technologies yes because they're going to regulate
01:08:29.620
it they're going to make it so that everyday people are not going to be able to use these things
01:08:33.380
and then they're going to have free rein on how to you know how to abuse these things just like
01:08:37.620
with encryption right encryption is another one that's right but they've been doing that for decades
01:08:42.900
yes like we get privacy but you're not allowed it because we don't trust you right but by using your
01:08:47.780
money and the moral authority that you gave us to lead you we're going to hide from you everything
01:08:53.860
we're doing and there's nothing you can do about it yeah i mean that's the state of america right now
01:08:57.460
yeah yeah so how would they use ai to further oppress us i i mean you can use it in all sorts of
01:09:04.260
ways like uh autonomous drones we already have autonomous drones they get a lot a lot worse you
01:09:10.420
can uh you know there's a video on the internet where like the you know uh chinese guard or whatever
01:09:16.500
was walking with a dog with a robotic dog and the robotic dog had a gun mounted to it
01:09:21.380
uh and so you can have a set of robotic dogs with guns mounted a little sci-fi like you can make
01:09:28.980
as a dog lover that's so offensive to me it is kind of offensive yeah
01:09:33.940
in a world increasingly defined by deception and the total rejection of human dignity we decided to
01:09:39.700
found the tucker carlson network and we did it with one principle in mind tell the truth you have a god
01:09:46.900
given right to think for yourself our work is made possible by our members so if you want to enjoy an
01:09:53.540
ad-free experience and keep this going join tcn at tucker carlson.com slash podcast tucker carlson.com
01:10:01.460
there was this huge exposé in this magazine called +972 about how israel was using um ai to target
01:10:25.300
uh suspects but ended up killing huge numbers of civilians it's called the lavender
01:10:30.500
a very interesting piece um so the the technology wound up killing people who were not even targeted yes
01:10:52.500
boom i think it could be used for surveillance i'm not sure if it gives a special advantage i think the
01:10:59.540
they can get the advantage by again if these lobbying groups are successful part of their
01:11:06.500
you know their ideal outcome is to is to make sure that no one is training uh large language models and to
01:11:14.420
do that you would need to insert uh surveillance apparatus at the compute level and um and so
01:11:22.340
perhaps that's very dangerous our computers would like spy on us to make sure we're not training
01:11:26.580
ais um i i think you know the kind of ai that's really good at surveillance is kind of the vision ai
01:11:33.380
which china sort of perfected so that's been around for a while now i you know i'm sure there's ways to
01:11:40.100
abuse language models for for surveillance but i can't think of it right now what about manufacturing
01:11:47.620
um it would help with manufacturing right now people are figuring out how to do um
01:11:52.900
uh i invested in a couple companies that are figuring out how to apply this technology, foundation models, to robotics
01:12:00.740
um it's still early science but you might have a huge advancement in robotics
01:12:06.740
uh if we're able to apply this technology to it so the whole point of technology is to replace human
01:12:12.660
labor either physical or mental i think i mean historically that's what you know the steam engine
01:12:18.020
replaced the arm etc etc so if this is as transformative as it appears to be you're going to have a lot of
01:12:27.460
idle people and that's i think the concern that led a lot of your friends and colleagues to support
01:12:32.420
ubi universal basic income like there's nothing for these people to do so we just got to pay them to
01:12:37.460
exist you said you're opposed to that i'm adamantly opposed to that on the other hand like what's the
01:12:44.020
answer yeah so you know uh there's there's two ways to look at it we can look at the individuals that
01:12:50.020
are losing their jobs which is tough and hard i don't really have a good answer but we can look
01:12:54.820
at it from a macro perspective and when you look at it from that perspective for the most part
01:13:00.820
technology created more jobs over time you know a you know before alarm clocks we had this job called
01:13:08.420
the knocker-upper yeah which goes to your room you kind of pay them it was like come every day at like
01:13:13.300
five yeah they knock on your or ring the village bell right yeah and you know that job disappeared but
01:13:19.620
like we we had you know 10 times more jobs in manufacturing or perhaps you know 100 or 1000
01:13:26.420
more jobs in in manufacturing and so overall i think the general trend is technology just creates
01:13:34.420
more jobs and so like i'll give you a few examples how ai can create more jobs actually you can create
01:13:40.100
more interesting jobs um entrepreneurship is like a very american thing right it's it's like america is the
01:13:48.260
entrepreneurship country but actually new firm creation has been going down for a long time at
01:13:53.220
least 100 years it's just like been going down although we have all this excitement around startups
01:13:58.340
or whatever uh silicon valley is the only place that's still producing startups like the rest of
01:14:03.300
the country there isn't as much startup or new firm creation which is kind of sad because again the
01:14:09.700
internet was supposed to be this you know great wealth creation engine that anyone has access to
01:14:14.260
but the way it turned out is like it was concentrated in this one geographic area well it looked i mean
01:14:20.020
in retrospect looks like a monopoly generator actually yeah but again it doesn't have to be that way and
01:14:25.860
and the way i think ai would help is that it will give people the tools to start businesses because you
01:14:33.300
have this easily programmable machine that can help you with programming i'll give you a few examples we
01:14:38.740
there's a teacher in denver that you know during covid was a little bored went to our website we
01:14:45.060
have a free course to to learn how to code and uh he learned a bit of coding and uh he used his knowledge
01:14:51.940
as a teacher to build an application that helps teachers use ai to teach and within a year he built a
01:15:00.740
business that's worth tens of millions of dollars that's bringing in a huge amount of money i think he raised 20
01:15:06.580
million dollars uh and that's a teacher who learned how to code and created this massive business really
01:15:13.460
quickly we have stories of photographers doing millions of dollars in revenue uh so it just it's a
01:15:21.140
it's a you know ai will decentralize access to this technology so there's a lot of ways in which
01:15:27.940
you're right technology tend to centralize but there's a lot of ways that people kind of don't really
01:15:32.820
look at in which technology can decentralize well that was i mean that promise makes sense to me i
01:15:37.620
i would just i fervently want it to become a reality we have a mutual friend i won't share
01:15:41.700
his name who's so smart and a good humane person um who is very way up into the subject and participates
01:15:50.900
in the subject and he said to me well one of the promises of ai is that it will allow people to have
01:15:57.860
virtual friends or mates that it will feel you know it will solve the loneliness problem
01:16:05.220
that is clearly a massive problem in the united states and i felt like i don't want to say it
01:16:09.540
because i like him so much but that seemed really bad to me yeah i'm not interested in those
01:16:16.980
i think we have the same intuition about about you know what's what's dark and dystopian versus what
01:16:22.580
and he's a wonderful person but i may i just don't think he's thought about it or i don't know what
01:16:27.060
but we disagree but why just i don't even disagree i don't have an argument just an instinct but like
01:16:31.940
people should be having sex with people not machines right that's right um like i i would go so far as
01:16:39.540
to say some of these applications are like a little unethical like the you know preying on sort of
01:16:44.260
lonely lonely men with no uh with no with no opportunities for for a mate and uh like you
01:16:52.180
know it will make it so that they were actually not motivated to go out and yes and date and get
01:16:57.540
an actual girlfriend like porn 10x yes yes and i think that's really bad that's really bad for society
01:17:03.460
uh and so i think the application look you can apply apply this technology in a positive way or you
01:17:07.700
can apply the negative way you know i would love for this you know doom cult if instead they were
01:17:13.860
like trying to you know make it so that ai is applied in a positive way if we had a cult that
01:17:19.380
was like oh we're gonna lobby we're gonna uh sort of go out and uh you know uh make it make it so
01:17:28.260
that um you know ai is a positive technology i'd be all for that and by the way there are in history
01:17:34.740
there are you know times where the culture self-corrects right i think there's some self-correction on
01:17:40.900
porn that's happening right now um you know uh fast food right i mean if you know just generally
01:17:47.380
junk you're right uh you know everyone is like whole foods is like high status now like you you
01:17:52.820
eat whole foods there's a place called whole foods you can go to that's right and people are interested in
01:17:57.540
eating healthy and chemicals in the air and water another thing that was a very esoteric concern even
01:18:02.660
10 years ago was only the wackos it was bobby kennedy cared about that no one else did now that's like
01:18:07.220
a feature of normal conversation yes everyone's worried about microplastics in their testicles
01:18:12.660
that's right yeah which is i think a legitimate concern absolutely so what i'm not surprised that
01:18:17.540
there are cults in silicon valley i don't think you named the only one i think there are others that's
01:18:20.980
my sense and i'm not surprised because of course every person is born with the intuitive knowledge
01:18:26.980
that there's a power beyond himself that's why every single civilization has worshiped something
01:18:32.500
and if you don't acknowledge that you just it doesn't change you just worship something even
01:18:35.780
dumber yeah but so my question to you as someone who lives and works there is what percentage of
01:18:40.260
the people who are making decisions in silicon valley will say out loud you know not i'm a christian jew
01:18:45.140
or muslim but that like i'm not you know there is a power bigger than me in the universe do people
01:18:51.380
think that do they acknowledge that you know for the most part no um i thought yeah like i think most
01:18:59.940
i don't want to say most people but like you know the vast majority of the discussions seem to be
01:19:05.300
like more intellectual i think people just take for granted that everyone has like a secular mostly
01:19:10.420
secular point of view well i think that you know that the truly brilliant conclusion is that we don't
01:19:16.900
know a lot and we don't have a ton of power that's my view something right right so like the actual
01:19:21.940
intellectual will over time if he's honest will read this is the view of like many scientists and many
01:19:27.460
people who really went deep i mean i don't know who said i'm trying to remember but someone said
01:19:31.380
like the first gulp of science make you an atheist but at the bottom of the cup uh you'll find god
01:19:38.980
waiting for you that's mattias desmet he wrote a book about this supposedly about covid it was not about
01:19:44.260
covid i just cannot recommend it uh more strongly but the book is about the point you just made which is
01:19:50.340
the deeper you go into science the more you see some sort of order reflected that is not random at
01:19:58.020
all yes and a beauty exhibited in um in math even uh and the less you know and the more you're certain
01:20:07.700
that there's a design here and that's not human or quote natural it's supernatural that that's
01:20:15.380
his conclusion and i affirm it but how many people do you know in your science world who think that
01:20:25.220
yeah i can count them on on one hand basically how interesting yeah that concerns me because i
01:20:31.220
feel like without that knowledge hubris is inevitable yeah and and you know a lot of these conclusions
01:20:36.980
are from hubris like the the fact that you know there's so many people that believe that ai is an
01:20:42.420
imminent existential threat a lot of people believe that that we're gonna die we're all gonna die in
01:20:46.500
the next five years comes from that hubris how interesting i've never until i met you i've never
01:20:53.540
thought of that that actually that is itself an expression of hubris i never thought of that
01:21:00.820
yeah you can go negative with hubris you can go positive and we're gonna and i think the positive
01:21:06.340
thing is is good like i think elon is an embodiment of that it's like just a self-belief that you can
01:21:11.700
like fly rockets and build electric cars is good and maybe in some cases it's delusional but like
01:21:17.620
net net will kind of put you in a on a on a good path for for creation i think it can go pathological if
01:21:24.100
you um if you you know if you're for example sbf and again he's kind of part of those groups
01:21:30.500
um just sort of believed that he can do anything in service of his ethics uh including steal and
01:21:40.340
cheat and all of that yeah i don't i never really understood well of course i understood too well i
01:21:45.780
think but the the obvious observable fact that effective altruism led people to become shittier
01:21:55.620
toward each other not better yeah i mean it's such an irony but i feel like it's in the name
01:22:04.020
if you call yourself such a grandiose thing you're typically yeah horrible like the islamic state
01:22:11.860
is neither islamic nor a state the effective altruists are neither the united nations it's not
01:22:18.420
united no that's boy is that wise so i don't think to your earlier point that any
01:22:25.620
large language model or machine could ever arrive at what you just said the irony because
01:22:33.220
like the deepest level of truth is wrapped in irony always and i don't machines don't get irony right
01:22:39.700
not yet could they um maybe i i mean i i don't think i don't take as strong of a stance as as you are
01:22:47.940
at like the you know the capabilities of the machines i do believe that you know if you represent
01:22:53.060
it well i don't know i mean i'm asking i really don't know what they're capable of well i i think
01:22:57.540
maybe they can't come up with real novel irony that is like really insightful for us but if you put a
01:23:04.180
lot of irony in the data they'll understand right they can ape human irony they can ape it i mean they're
01:23:09.060
ape machines they're imitation machines they're literally imitating like you know the way large
01:23:15.540
language models are trained is that you give them a corpus of text and they hide different words and
01:23:20.420
they try to guess them and then they adjust the weights of those neural networks and then
01:23:25.140
eventually they get really good at guessing what humans would say
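A toy illustration of the training game being described: hide a word, guess it from the surrounding words, and update what you've learned. Real large language models do this with billions of neural-network weights adjusted by gradients over enormous corpora; this count-based sketch only shows the shape of the loop:

```python
from collections import Counter, defaultdict

# Tiny "corpus"; real models train on something closer to the whole internet.
corpus = "the cat sat on the mat the cat sat on the rug the dog sat on the mat".split()

counts = defaultdict(Counter)
for i in range(len(corpus) - 2):
    context = (corpus[i], corpus[i + 1])
    hidden = corpus[i + 2]           # the word the model has to guess
    counts[context][hidden] += 1     # the toy version of "adjusting the weights"

def guess(w1, w2):
    # Guess the hidden word that most often followed this context.
    return counts[(w1, w2)].most_common(1)[0][0]

print(guess("sat", "on"))   # -> "the"
print(guess("on", "the"))   # -> "mat" (seen twice, vs "rug" once)
```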
01:23:30.020
well then okay so you're just kind of making the point unavoidable like if the machines as you have said, and it makes sense, are the
01:23:35.620
sum total of what's put into them yeah then and that would include the personalities and biases of the
01:23:41.460
people yes putting the data in that's right then you want like the best people the morally best
01:23:47.220
people which is to say the most humble people to be doing that but it sounds like we have the least
01:23:51.540
humble people doing that yeah i think some of them are humble like i wouldn't like i think some people
01:23:56.580
working in ai are really upstanding and and good and want to do the right right thing but there are a
01:24:01.940
lot of people with the wrong motivations coming at it from fear and things like that
01:24:06.260
look this is the other point i will make is that um you know free markets are are good because you're
01:24:12.420
going to get all sorts of entrepreneurs with different motivations uh and and and i think
01:24:18.820
what's what um what determines the winner is not always the ethics or whatever but it's the
01:24:24.500
larger culture like what kind of product is it pulling out of you if they're pulling
01:24:29.220
the porn and the uh you know companion chat bots whatever versus they're pulling the education
01:24:37.060
and the health care and i think all the positive things that will make uh our life better i think
01:24:44.100
that's really on the on the larger culture i don't think we can regulate that with government or whatever
01:24:50.020
but if the culture creates demand for things that just make us worse as humans then there are entrepreneurs
01:24:59.140
that will spring up and serve this no that's totally right and it is a snake eating its tail at some
01:25:05.300
point because of course you know you you serve the baser human desires and you create a culture
01:25:13.780
that inspires those desires in a greater number of people in other words the more porn you have
01:25:17.540
the more porn people want like actually yes
01:25:19.860
i wonder about the pushback from existing industry from the guilds
01:25:38.180
so like if you're the ama for for example you mentioned medical advances that's something that makes
01:25:42.820
sense to me it for for diagnoses which really is just a matter of sorting the data like what's most
01:25:48.900
likely that's right and a machine can always do that more efficiently and more quickly than any
01:25:53.780
hospital or individual doctor so like and diagnosis is like the biggest hurdle yes um that's gonna like
01:26:03.220
that's gonna actually put people out of business right if i can just type my symptoms into a machine
01:26:08.340
and i'm getting a much higher likelihood of a correct diagnosis than i would be after three
01:26:13.780
days at the mayo clinic like who needs the mayo clinic i actually have a concrete story about that i've
01:26:17.700
i've dealt with like a chronic issue for a couple years uh i spent hundreds of thousands of dollars on
01:26:23.780
on doctors out of pocket get like world's experts and all hundreds of thousands of yes and they
01:26:30.900
couldn't come up with a right diagnosis and eventually it took me like writing a little bit of
01:26:35.940
software to collect the data or whatever but i ran it i ran the ai i used the ai i ran the ai once
01:26:41.460
and it gave me a diagnosis they haven't looked at and i went to them they were very skeptical of it
01:26:45.620
and then we ran the test turns out it was the right diagnosis oh that's incredible yeah it's amazing it
01:26:50.500
changed my life that's incredible but you had to write the software to get there yeah a little bit
01:26:55.380
of software so that's just we're not that far from like having publicly available right and by the
01:27:01.220
the way i i think that anyone can write a little bit of software right now at replit we are
01:27:05.460
working on a way to generate most of the code for you we have this program called 100 days of code
01:27:11.700
if you give it 20 minutes you know do a little bit of coding every day in like three months you'll be
01:27:19.060
good enough coder to build uh you know a startup i mean eventually you'll get people working for you
01:27:24.420
and you'll upscale and all that but you'll have enough skills and in fact you know i'll put up a
01:27:28.980
challenge out there people listening to this if they go through this and they build something that
01:27:34.260
they think could be a business whatever i'm willing to help them get it out there promoted we'll give
01:27:38.740
them some credits and cloud services whatever just you know tweet at me or something and mention this
01:27:43.860
this podcast and i'll help what's your twitter amasad a-m-a-s-a-d
01:27:51.140
so but there are a lot of entrenched interests i mean i don't want to get into the whole
01:27:55.060
covid poison thing but i'm revealing my biases um but i mean you saw it in action during covid where
01:28:03.860
you know it's always a mixture of motives like i do think there were high motives mixed with low
01:28:08.100
motives because that's how people are you know it's always a bouillabaisse of good and bad but um but to some
01:28:13.860
extent the profit motive prevailed over public health yes that is i think fair to say yes and so they if
01:28:20.500
they're willing to hurt people to keep the stock price up they i mean what's the resistance you're
01:28:26.100
going to get to allowing people to come to a more uh accurate diagnosis with a machine for free yeah
01:28:36.420
so in in some sense that's why i think open source ai people learning how to do some of the stuff
01:28:43.300
themselves is uh is is probably good enough uh of course if there's a company that's building these
01:28:50.900
services it's going to do better but just the fact that this ai exists and a lot of it is open
01:28:56.500
source you can download it on your machine and use it is enough to potentially help a lot of people
01:29:02.580
by the way you should always talk to your doctor i talked to my doctor i'm not giving people advice
01:29:06.420
kind of figure out all this themselves but i do think that uh it's already empowering so that that's
01:29:13.140
sort of step one but for someone like me i'm not going to talk to a doctor until he apologizes to my
01:29:17.300
face for lying for four years right because i have no respect for doctors at all i have no respect for
01:29:22.100
anybody who lies period and i'm not taking life advice and particularly important life advice like
01:29:28.820
about my health from someone who's a liar i'm just not doing that because i'm not insane i don't take
01:29:32.260
real estate advice from homeless people i don't take financial advice from people who are going to jail
01:29:36.100
for fraud so like um i'm sure there's a doctor out there who would apologize uh but i haven't met
01:29:43.620
one yet so for someone like me who's just i'm not going to a doctor until they apologize uh this could
01:29:49.620
be like literally life-saving right so um to the question of whether there's going to be a regulatory
01:29:55.780
capture i think that's why you see silicon valley getting into
01:30:03.620
politics hmm you know silicon valley uh you know was always sort of apolitical you
01:30:11.140
know when i was i remember i came in 2012 it was uh uh early on in my time there was it was the romney
01:30:19.540
obama debate and i was can i just pause yeah imagine a debate between romney and obama who agree on
01:30:28.020
everything yes uh i i didn't see a lot of daylight and and people were just like making fun of romney
01:30:34.820
it's like it was like he said something like binders full of women and kind of that stuck with him or
01:30:38.660
whatever um and i remember asking everyone around me like who who are you with i was like of course
01:30:45.060
democrats like of course uh it was like why why isn't anyone here for republicans and they're like
01:30:52.100
like oh because they're dumb you know only dumb people are gonna vote for republicans uh and you
01:30:58.500
know silicon valley was this like one state town in a way uh actually look you know there's like um
01:31:07.060
data on like donations by company for instance there's like netflix is 99 percent to democrats and
01:31:15.140
like one percent to uh republicans if you look up the you know diversity of parties in north korea it's
01:31:21.540
actually a little better oh of course it is more choices they have a more honest media too but
01:31:27.060
anyways i mean you see now a lot of people are surprised that a lot of um you know people in
01:31:31.220
tech are going for republicans uh going for trump um and uh particularly marc andreessen and ben horowitz
01:31:39.140
uh put out a two-hour podcast talking about they are the biggest venture capitalists in the united states
01:31:44.500
i think uh i don't know on what metric you would you would judge but they're certainly on on their way
01:31:49.540
to be the biggest they're the most i think the best for sure um and uh and they put out what was
01:31:58.980
their i i didn't i should have watched i didn't yeah so their reasoning for why they would vote for
01:32:05.380
trump by the way you know they would have never done that in like 2018 or 19 whatever uh
01:32:13.220
uh and uh so this this vibe shift that's happening and how is it received uh it's still it's still
01:32:21.300
mixed but but i think you know way better than what would have happened 10 years ago they would
01:32:25.780
have been canceled and they would have no one would ever like no founder would take their money
01:32:29.620
but it's like and i mean again i'm an outsider just watching but andreessen horowitz is so big and so
01:32:35.220
influential and they're considered smart and not at all crazy yeah that like that that's gotta change
01:32:42.180
minds if andreessen horowitz is doing it yeah yeah it will have certainly changed minds um i think a lot
01:32:48.100
i think you know give people some courage to say i'm for trump as well at minimum but i think it
01:32:54.820
does change minds and they put out the arguments um you know they put out this agenda called little
01:33:00.180
tech you know there's big tech and they have their lobbying and whatever who's lobbying for little
01:33:04.500
tech like smaller companies companies like ours but much smaller too like you know one two person
01:33:09.700
companies and actually no one is your company would be considered little in silicon valley uh
01:33:18.660
but i want a little company right um so uh but you know like really your startups that just
01:33:26.260
started right like you know typically no one is um protecting them sort of politically no one is
01:33:33.620
really thinking about it and it's very easy to disadvantage startups like you just talked about with
01:33:37.940
health care regulation or whatever very easy to create regulatory capture such that companies
01:33:43.780
can't even get off the ground doing their thing um and so they came up with this agenda that like
01:33:49.700
we're going to be the you know the firm that's that's going to be looking out for for that little guy
01:33:55.460
the little tech right which i think is brilliant and you know part of their argument uh for trump is
01:34:01.140
that you know uh the um you know ai for example like the the democrats are really excited about regulating ai
01:34:11.780
um one of the most hilarious things that happened i think uh um kamala harris was uh invited to
01:34:20.100
ai safety conference and they were talking about existential risk and she was like well someone being
01:34:26.980
denied healthcare that's existential for them someone uh whatever that's existential so she
01:34:32.740
interpreted existential risk as like any risk is is existential and so you know that's just one
01:34:37.780
anecdote but like yeah there was this anecdote where she was like ai it's a two-letter word and
01:34:42.820
she clearly doesn't understand it very well and they're they're moving very fast at regulating it
01:34:48.100
they put out an executive order that a lot of people think they kind of i mean the the tweaks
01:34:53.300
they've done so far from a user perspective to keep it safe are really like just making sure it
01:34:58.180
hates white people like it's about pushing a dystopian totalitarian social agenda racist social
01:35:06.420
agenda on the country like can is that going to be embedded in it permanently um i think it's a
01:35:11.940
function of the culture rather than the regulation so i think uh the culture was sort of this woke
01:35:17.620
culture uh broadly right in america but certainly in silicon valley and now that the vibe shift is
01:35:25.140
happening i think i think microsoft just fired their dei team microsoft yeah uh i mean it's a huge
01:35:32.500
vibe shift are they gonna learn to code do you think microsoft perhaps um so uh you know the uh you know
01:35:42.980
i wouldn't pin this on the government just yet but it's very easy oh no no no i i just spent
01:35:46.740
democratic members of congress i know for a fact applied pressure oh they did to the labs like
01:35:51.380
no you can't it has to reflect our values okay yeah yeah so maybe that's where it is that permanent
01:35:57.700
am i always going to get when i type in who is george washington you know a picture of denzel
01:36:02.020
washington you know it's already changing is what i'm saying it's already a lot of these things are
01:36:06.500
being reversed it's not perfect but it's already changing and that's i think it's just a function of
01:36:11.460
the larger culture culture change yeah i think elon buying twitter uh is uh in letting people talk and
01:36:19.700
debate moved the culture to like i think a more moderate place i think he's gone a little more uh
01:36:28.180
you know a little further but like i think that it was net positive on the culture because it was so
01:36:34.820
far left uh it was so far left inside these companies the way they were designing their products
01:36:40.900
such that you know george washington will look like there's like a black george washington yeah
01:36:45.380
have you uh that's just insane right it was like it was verging on insanity well it's lying and that's
01:36:51.300
what freaked me out i mean it's like i don't know just tell the truth there are lots of truths i don't
01:36:54.900
want to hear that don't comport with my you know desires but i don't want to be lied to george
01:37:00.340
washington was not black none of the framers were they were all white protestant men sorry
01:37:04.340
all right yeah so like that's a fact deal with it so if you're gonna lie to me about that um you're
01:37:10.500
my enemy right i think so i mean you're uh and i would say it's a small element of these companies
01:37:15.460
that are doing that yes but they tend to be the control they were the controlling element those like
01:37:21.060
sort of activist folks that were and i was at facebook in 2015 uh and worked at facebook i worked
01:37:28.260
at facebook yeah i didn't know that i worked on open source mostly i worked on react and react
01:37:33.300
native one of the most popular ways of programming user interfaces so i i mostly worked
01:37:39.060
on that i didn't really work on the kind of blue app and and all of that um but i saw the sort of
01:37:45.860
cultural change where like a small minority of activists were just like shaming anyone who is
01:37:51.700
thinking independently and it sent silicon valley in this like sheep-like direction where everyone
01:37:59.780
is afraid of this activist class uh because they can cancel you they can you know uh i think one of
01:38:07.620
the early shots fired there was like brendan eich the inventor of javascript the inventor of the
01:38:12.980
language that like runs the browser uh because of the way he votes or donates or whatever gets fired from
01:38:20.180
his position as ceo of mozilla the browser company and that was like seen as a win or something and i was like
01:38:28.900
again i was like very politically you know i was not really interested in politics in like 2012 13 when
01:38:34.340
i first came to this country but i just accepted as like oh you know all these people democrats
01:38:39.620
liberal is what you are whatever but i just looked at that i was like that's awful like you know no matter
01:38:47.140
what his political opinion is like you know you're you're you're taking from a man his ability to earn
01:38:54.020
a living eventually he started another browser company and it's good right but uh this like sort
01:38:59.780
of cancel culture created such a bubble of conformism and the leadership class at these companies were
01:39:07.540
actually afraid of the employees so that is the fact that bothers me most silicon valley is defining our
01:39:12.980
future it that is technology we don't have kind of technology in the united states anymore manufacturing
01:39:19.140
creativity has obviously been extinguished everywhere in the visual arts you know everywhere yeah silicon
01:39:23.700
valley is the last yes place important what's the most important yes and so the number one requirement
01:39:31.220
for leadership is courage number one yes number one nothing even comes close to bravery um as a requirement
01:39:36.820
for wise and effective leadership so if the leaders of these companies are afraid of like 26 year
01:39:42.100
old unmarried screechy girls in the hr department like whoa that's really cowardly like shut up
01:39:51.460
you're not leading this company i am like that's super easy i don't know why that's so hard like what
01:39:58.340
the reason i think it was hard it was uh because these companies were competing for talent hand over
01:40:05.620
fist and it was the sort of zero interest rate era in the u.s economy that's right and everyone
01:40:13.700
was throwing cash at like talent and therefore if you offend the sensibilities of the employees even to
01:40:19.700
the you know slightest bit uh you're afraid that they're they're gonna leave or something like that i'm
01:40:26.500
trying to make up an excuse for them but well you would you could answer this question because you are
01:40:30.900
the talent that you know you came all the way from jordan yeah to work in the bay area to be at the
01:40:36.660
center of creativity inside so um the people who do what you do who can write code yeah the basis of
01:40:45.460
all of this are they i don't like they seem much more like you or james damore they just they don't seem
01:40:53.220
like political activists to me for the most part yeah there's there's still uh a segment of the sort of
01:40:58.740
programmer population well they have to be rational because code is about reason right
01:41:04.100
nah i mean this is the whole thing you know it's like i don't think i mean a lot of these people
01:41:07.780
that we talked about are into code and things like that they're not rational really yeah like look i
01:41:12.420
think coding could help you become more rational but you can very easily override that i think that's
01:41:17.380
the basis of it i thought oh if this is true and that is true then that must be true i thought that
01:41:22.180
was the yeah but you know people are very easy very it's very easy for people to just uh you know
01:41:27.780
compartmentalize right i was like now i'm doing coding now i'm doing emotions oh so the brain is
01:41:33.060
not a computer the brain is not exactly exactly that's my point i know you know when so you know
01:41:40.020
i i'm probably you know responsible for the most amount of people learning to code in america because
01:41:45.220
i i was like a um i like built the reason i came to the us is i built this piece of software that
01:41:51.380
was the first to make it easy to uh code in the browser and it went super viral and a bunch of
01:41:57.700
u.s companies started using it including uh codecademy uh and i joined them as like a founding
01:42:05.300
engineer they had just started two guys amazing guys that just started and i joined them and we
01:42:09.940
taught like 50 million people how to code you know many of them many millions of them are american
01:42:14.980
um and you know the the uh sort of rhetoric at a time what you would say is like coding is important
01:42:20.580
because you know it'll teach you how to think computational thinking and and all that i sort of like not
01:42:26.580
maybe i've said it at some point but i've never really believed it i think coding is a tool you
01:42:31.780
can use to build things to automate things it's a fun tool you can do art with it you can do a lot of
01:42:37.060
things with it but ultimately i don't think you know you can you know sit people down and sort of
01:42:43.700
make them more rational um and you get into all these weird things if you try to do that you know
01:42:50.180
people can become more rational by virtue of education by virtue of seeing that you know
01:42:56.740
taking a more rational approach to um to their life yields results but you can't you can't like
01:43:04.660
really teach it that way well i agree with that completely that's interesting i just thought it was
01:43:10.180
a certain because i i have to say without getting into controversial territory um every person i've ever
01:43:15.540
met who writes code like is kind of similar in some ways to every other person i've ever met who
01:43:19.620
writes code like it's not true yeah not a broad cross-section of any population no at all well
01:43:27.540
people who make it a career but i think anyone sort of can write a little bit of code i'm sure i mean
01:43:31.300
people who get paid to do it right right yeah um interesting so bottom line do you see and then
01:43:36.900
we didn't even mention elon musk and david sacks um who have also come out for trump so do you think
01:43:45.540
the vibe shift in silicon valley is real yes actually i would credit sacks originally like
01:43:51.700
perhaps more than elon because look it's a one-party state yeah no one watches you for example
01:43:59.380
no one ever watched anything sort of you know i was i don't want to over-generalize but most people
01:44:05.620
didn't get any right wing or center-right opinions for uh for the most part didn't seek it it wasn't
01:44:13.780
there you're swimming in just you know liberal democratic uh sort of talking points
01:44:20.340
i'd say sacks uh on the all-in podcast was sort of the first time a lot of people started on a weekly
01:44:27.940
basis hearing a conservative that being david sacks amazing and i would start to hear at parties and
01:44:35.700
things like that people describe their politics as sacksism like they started calling it they
01:44:40.980
they were like you know i agree with you know most of the time i agree with you know sacks's uh
01:44:46.820
point of view on the all-in podcast like yeah you're kind of maybe moderate or center-right at
01:44:52.660
this point and he's so reasonable he's first of all he's a wonderful person i in my opinion but um
01:44:59.780
like i didn't have any sense of the reach of that podcast until i did it had no sense at all
01:45:04.820
all and he's like would you do my podcast sure because i love david sacks i do the podcast like
01:45:09.780
everyone i've ever met text me oh you're on all in podcasts like it's it's not my world but i didn't
01:45:16.660
realize that is the vector if you want to reach sort of business-minded people who are not very
01:45:22.580
political but are probably going to like send money to a buddy who's bundling for kamala because
01:45:26.980
like she's our candidate yes i mean that's the way to reach people like that that's right
01:45:31.780
right and by the way this is my point about technology can have a centralizing effect but
01:45:36.420
also decentralizing yes so youtube uh you can argue uh youtube is the centralized thing they're pushing
01:45:43.060
opinions on us whatever but you know now you have a platform on youtube after you you got fired from
01:45:48.180
fox right yeah uh you know sax uh can have a platform uh and put these opinions out and i think
01:45:55.780
uh you know there was a moment during covid that i felt like they're going to close everything down
01:46:02.740
yeah uh for good reason you felt that way yes uh and maybe there were maybe there's going to be
01:46:08.260
some other event that that will like allow them to close it down but you know one of the things i
01:46:12.900
really love about america is the first amendment it's just it's just the the most important uh
01:46:18.900
institutional innovation in the history of humanity i agree with that completely and we should
01:46:23.780
really you grew up without it too i mean it must be we should really protect it like
01:46:29.220
we should be so covetous of it you know like your wife or something
01:46:35.460
can you agree hands off yeah can you just repeat your description of its importance historically i'm
01:46:42.500
sorry you put it so well uh it's the most important institutional innovation in human history
01:46:49.380
the first amendment is the most important institutional innovation in human history yes
01:46:54.820
i love that i think it's absolutely right and as someone who grew up with it in a
01:46:59.460
country that had had it for you know 200 years when i was born um you know you don't feel
01:47:04.980
that way it's just like well it's the first amendment it's like just part of nature right it's like gravity
01:47:08.820
it just exists but as someone who you know grew up in a country that does not have it which is
01:47:13.620
true of every other country on the planet it's the only country that has it um you see it that way
01:47:19.540
you see it as the thing that makes america america well the thing that makes it so that we can change
01:47:24.260
course yes right and the reason why you know we had this uh you know conformist um mob rule
01:47:37.380
mentality that people call woke uh you know um the reason that we're now past that almost you
01:47:47.300
know it's still kind of there but we're on our way past that is because of the first
01:47:53.140
amendment and free speech and again i would credit elon a lot for buying twitter and letting us talk and
01:47:58.900
debate and push back on the craziness right well it's beautiful i've been a direct
01:48:05.620
beneficiary of it as i think everyone in the country has been so i'm no critic and i love elon
01:48:10.420
but i mean it's a little weird that like a foreigner has to do that a foreign-born
01:48:17.060
person like you or elon appreciates it in this way it's a little depressing like why didn't some
01:48:23.140
american-born person do that i guess because they don't we don't take it yeah you don't take it for
01:48:27.860
granted i wrote a thread it's like 10 things i like about america i expected it to do well but you
01:48:32.900
know it was like three four years ago it went super viral the wall street journal covered it peggy
01:48:38.500
noonan you know called me and was like i want to write a story about it i was like okay it's like
01:48:42.980
a twitter thread you can read it but uh you know i just like talk about normal things you know
01:48:50.580
free speech is one of them but also like hard work appreciation for talent and all of that
01:48:57.380
and it was starting to close up right i started to see you know meritocracy kind of like
01:49:01.540
being less valued and that's part of the reason why i wrote the thread and what i realized
01:49:08.820
is like you know yeah most americans just don't think about that and don't really value it
01:49:13.860
as much i agree and so maybe you do need oh i think that's absolutely right um but why
01:49:20.900
do you i mean i have seen i hate to say this because i've always thought my whole life that foreigners
01:49:25.540
um are great you know i like traveling to foreign countries my best friend is foreign-born
01:49:31.620
actually um as opposed as i am to mass immigration which i am arabs really like you by the way oh
01:49:37.220
well i really like arabs i've thrown off the brainwashing um just a sidebar i feel like we had a bad experience
01:49:46.100
with arabs 23 years ago and what a lot of americans didn't realize but i knew from traveling a lot in the
01:49:52.100
middle east yeah it was bad it was bad however like that's not representative of the people that
01:49:58.980
i have dinner with in the middle east at all like someone once said to me like those are the worst
01:50:04.340
people in our country right and no i totally agree with that strongly um i always defend the arabs
01:50:12.180
in a heartfelt way but uh i wonder if particularly the higher-income immigrants
01:50:23.060
recently i've noticed are like parroting the same kind of anti-american crap that they're learning
01:50:29.700
from the institutions you know you come from punjab and go to stanford and all of a sudden like you've
01:50:35.780
got the same rotten decadent attitudes as your native-born professors at stanford do you see that
01:50:42.420
no i'm not sure what the distribution is like i mean speaking of indians on the
01:50:47.940
right side of the spectrum we have vivek who's the best yeah who's a perfect example of what i'm
01:50:52.980
saying like yeah vivek has thought through not just like first amendment good but why it's good yeah
01:50:58.180
well you know i'm not sure you know yeah i think
01:51:05.220
foreigners for the most part do appreciate it more but it's easy you know i talked about how i just
01:51:11.780
you know try not to be you know this conformist kind of really absorbing everything around me and
01:51:16.900
acting on it but it's very easy for people to go into these um one-party-state places uh and really
01:51:24.740
get you know become part of this like mob mentality where everyone believes the same thing and any
01:51:32.260
deviation uh from that is considered a cancelable offense and you know you asked about the shift in
01:51:39.620
silicon valley i mean part of the shift is like yes silicon valley still has a lot of people who are
01:51:43.700
independent minded and they see this sort of conformist type of thinking in the democratic
01:51:50.500
party and that's really repulsive for them where you know there's like a party line it's like biden's
01:51:56.740
sharp as a tack sharp as a tack everyone says that and then the debates happen oh unfit unfit unfit
01:52:03.220
and then oh he's out oh kamala kamala it's like right it's like you know lockstep and there's like no
01:52:08.820
range there's very little dissent within that party and maybe republicans i think at
01:52:14.980
some point were the same maybe now it's uh sort of a little different but this is
01:52:20.180
why people are attracted to the other side and so by the way this is advice for the democrats
01:52:24.900
like if you want sort of silicon valley back you know maybe don't be so controlling of opinions and
01:52:34.420
like be okay with more dissent you have to relinquish a little bit of power to do that
01:52:39.140
i mean it's the same as raising teenagers there's always a moment in the life of every parent of
01:52:43.380
teenagers where a child is going in a direction you don't want you know if it's a
01:52:48.340
shooting heroin direction you have to intervene with maximum force but there are a lot of directions
01:52:52.420
a kid can go that are deeply annoying to you yes and you have to restrain yourself a little bit
01:52:58.900
if you want to preserve the relationship actually if you want to preserve your power over the child
01:53:04.820
you have to pull back and be like i'm not going to say anything that's right this child will come
01:53:09.460
back to me my gravitational pull is strong enough but i'm not going to lose this child because
01:53:15.300
she does something that offends me today that's right you know what i mean
01:53:19.140
um you can't hold too tightly and i feel like they don't understand i feel like the democratic party
01:53:24.500
i'm not an intimate of course i'm not in the meetings but i feel by their behavior that they
01:53:30.260
feel very threatened that's what i see yeah these are people who feel like they're losing their power
01:53:35.780
yes and so they have to control what you say on facebook i mean what yes if you're worried about
01:53:40.660
what people say on facebook you know you've lost confidence in yourself that's right that's right
01:53:45.300
do you feel that yeah and i mean you know then yeah you know uh there's matt taibbi and michael
01:53:50.980
shellenberger and a lot of folks you know did a lot of great work on censorship yes and the
01:53:57.220
government's kind of uh involvement in that and how they pushed social media companies i don't know
01:54:04.180
if you can put it just on the democrats because i think part of it happened during the trump
01:54:07.540
administration as well for sure but i think they're more excitable about it they
01:54:12.660
really love misinformation as a term which i think is kind of a bs term it's a meaningless
01:54:18.340
term it's a meaningless term all that matters is whether it's true or not yeah and the term
01:54:22.420
mis and disinformation doesn't even address the veracity of the claim that's right it's like
01:54:26.660
irrelevant to them whether it's true in fact if it's true it's more upsetting yeah it's like everything
01:54:30.900
what we talked about earlier is just making people stupid by taking away their faculty for trying to
01:54:36.580
discern uh truth um i think that's you know that's how you actually become rational by like
01:54:43.140
trying to figure out whether something's true or not and then being right or wrong
01:54:49.860
and then that uh really kind of trains you to have better judgment what you talked about
01:54:58.100
judgment uh that's how people build good judgment you can't outsource your judgment to the group
01:55:04.980
which again feels like what's asked of us especially in liberal circles is that
01:55:09.860
no fauci knows better two weeks to stop the spread you know take the jab stay home wear the mask
01:55:16.900
you know it was just like talking down to us as children you can't discuss certain things on youtube
01:55:23.540
you'll get banned at some point you couldn't say the lab leak theory right which is now the mainstream
01:55:28.100
theory yes um and again a lot of this self-corrected because of the first amendment yeah and elon
01:55:35.540
wow that was uh that was as interesting as dinner was last night a little less profanity but i'm
01:55:41.380
really grateful that you took the time to do this thank you it's uh it's absolutely my
01:55:44.500
pleasure it was mine thank you thanks for listening to the tucker carlson show if you enjoyed it
01:55:51.540
you can go to tuckercarlson.com to see everything that we have made the complete library tuckercarlson.com