E563 AI CEO Alexandr Wang
Episode Stats
Length
2 hours and 9 minutes
Words per Minute
199.33678
Summary
In this episode, I chat with entrepreneur Alexander Wang about his trip to the inaugural, the future of AI, and the importance of social media in the world of business and technology. We also talk about what it's like to be a self-made billionaire at the age of 24.
Transcript
00:00:00.000
We hope you're enjoying your Air Canada flight.
00:00:10.720
Fast-free Wi-Fi means I can make dinner reservations before we land.
00:00:25.260
Wi-Fi available to Airplane members on Equipped Flight.
00:00:30.000
I want to let you know we've restocked all your favorites on the merch site.
00:00:45.880
I'll be in Chicago, Illinois on April 24th at the Wintrust Arena.
00:00:51.540
Fort Wayne, Indiana on April 26th at the Allen County War Memorial Coliseum.
00:00:57.260
And Miami, Florida on May 10th at the Kaseya Center.
00:01:01.960
Get your tickets early starting Wednesday, February 19th at 10 a.m.
00:01:10.360
We also have tickets remaining in East Lansing, Victoria, B.C. in the Canada.
00:01:36.540
He's a leader in the world of business and technology.
00:01:41.880
He started Scale AI, a company recently valued at $14 billion.
00:01:46.880
He started it when he was only 19, and at 24, he became the youngest self-made billionaire
00:01:56.240
We talk about his company, the future of AI, and the role it plays in our human existence.
00:02:36.440
Um, because you and I ended up, we like left out of there and then went and got lunch together,
00:02:43.480
Yeah, it was a, were you there the whole weekend?
00:02:45.780
No, I just got there, uh, the day, the morning of the inauguration.
00:02:51.880
Yeah, no, I mean, I think at least for what we do, like for AI, the, uh, the new administration
00:02:56.900
is really excited about it and wants to make sure that we, we get it right as a country.
00:03:00.580
So, um, that was all great, but it was kind of a crazy, the whole like event, like everything
00:03:07.880
I mean, when I saw Conor McGregor show up, that's when I was like, shit is, where are
00:03:13.460
It felt, that's what like the most bizarre, I mean, you were there, Sam Altman was there.
00:03:18.620
I was just like, it was like, whoa, what happened here?
00:03:22.540
Um, and part of that's because Alex Bruschewitz, you know?
00:03:25.920
I mean, it's all because of him that we all went, but just the fact that he would bring
00:03:29.060
these people together, it kind of makes you question crossover.
00:03:31.840
It's like, uh, it's like in, in those like TV shows when you have those crossover episodes.
00:03:37.580
When like the Harlem Globetrotters and it would be like them versus like, um, the Care Bears
00:03:52.940
Like, was there somebody you thought that was strange, like unique to see there for you?
00:03:57.760
I mean, you, Conor, like the, the, the Pauls, I don't know.
00:04:01.520
The whole, the whole thing was pretty, uh, pretty crazy.
00:04:04.600
And I see, uh, I see some of the tech people all the time.
00:04:07.120
I mean, it was funny crossover and it was obviously like the, so many people flew in to be able
00:04:15.480
And so, I mean, there were so many people in the city.
00:04:17.720
We ran into them obviously, but like there were so many people in DC, uh, who were just around
00:04:25.140
Oh, there was a couple hundred thousand people probably that just got kind of displaced in
00:04:28.680
And suddenly there's in bars or whatever, just like buying, like there was like people
00:04:33.480
I saw walking just adult onesies and shit that said Trump or on the, that's crazy shit.
00:04:49.800
And so when you grew up there, did you have like a strong sense of like science and technology?
00:04:56.880
Like, did your parents like lead you into that?
00:04:59.140
Like, what was just some of your youth like there?
00:05:01.760
So, uh, both my parents are physicists and there's a, I grew up in this town called Los
00:05:07.280
Alamos where there's a, there's a, there's a national lab there.
00:05:13.380
So the, so all the New Mexico shots in Oppenheimer, that's exactly where I grew up.
00:05:18.560
So it was like originally where all the scientists came together and, and did all the work in
00:05:25.880
That's basically the, everybody I knew effectively, like their parents worked at the lab or, or some
00:05:36.580
It was, you know, it, uh, at a level of mystery, like, is your prom like last night under the
00:05:45.300
Well, the funny thing is like, first you hear, you hear this, you learn the story of
00:05:50.300
the atomic bomb and the Manhattan project, like basically every year growing up, because
00:05:54.440
there's always like a day or a few days where, you know, there's a substitute teacher and
00:06:01.040
So you're just like, yeah, an alcoholic or something.
00:06:09.080
That's crazy that that just like every year you guys have like, yeah, just like blast it
00:06:15.200
And it's just, yeah, you're just learning, you're learning again about the Manhattan project.
00:06:19.600
And then there's a little, there's a little museum in town, uh, which is like you walk
00:06:23.760
through and there's like a, a life-size replica of the nukes.
00:06:35.220
They, yeah, they dropped them, uh, on, uh, in, in Japan.
00:06:41.100
Is it, is it crazy being like semi-Asia, part Asian?
00:06:49.080
Is it crazy being Asian and then having that happen to Asian people with the bomb?
00:06:56.220
I mean, I think the thing is like, you know, I, uh, so there weren't, I didn't grow up
00:07:00.400
with very many Asians, uh, cause the, in that town, it was, um, you know, it's, it's in New
00:07:08.260
So I was one of the only Asian kids in my class growing up.
00:07:12.080
And so, um, I didn't think that much about it, honestly, but then like, uh, but it is super
00:07:17.860
You know, you grow up and you learn about this very advanced technology that had this
00:07:28.700
It's like the scientific John Jones over there.
00:07:38.020
There's this, there's this energy always there of this creation.
00:07:42.020
So probably the possibility of creation maybe was always in the air.
00:07:45.080
I'm just wondering like, how did you get formed kind of, you know, like what's your
00:07:51.120
Cause, cause you know, there were all of these, there were all these like presentations
00:07:54.900
around what were the new kinds of science that were going on at the lab.
00:07:58.740
So there's all these chemistry experiments and these different like earth science experiments
00:08:05.440
Um, and my mom studied like plasma and like how plasma, you know, worked inside of stars and
00:08:13.760
So it was just the wildest stuff and you would talk to people's parent people, you like, I
00:08:19.280
would talk to my classmates or I talked to their parents about what they're working on
00:08:28.180
Cause, cause everybody in that town basically is, they're all working on some kind of crazy
00:08:35.400
And so you kind of, I mean, I feel like I grew up feeling like, you know, anything was
00:08:41.620
Like, yeah, cause the rest of us in other communities, shitty communities or whatever,
00:08:47.440
We're doing like grassroots level bullshit, you know?
00:08:55.740
So every, you see somebody just sneaking behind an alley and buying a bit of uranium and shit
00:09:00.240
like that in your neighborhood, that's gotta be kind of unique.
00:09:02.840
I remember there was someone from our town who did, uh, who did a science fair project
00:09:07.180
called, um, it's called like great balls of plasma.
00:09:10.180
And for the science, literally for the science fair, this was in like high school for the
00:09:13.760
science fair, they made like huge balls of plasma in their garage.
00:09:21.680
Like this is, we're all just doing this in high school.
00:09:26.900
So did you feel competitive or did you just feel like hyper capable?
00:09:31.180
Like, did you feel like kind of advanced just in your studies?
00:09:33.480
Like when you were young, like, did you were in class?
00:09:35.620
Did you like, are some of this stuff's kind of coming easy to me?
00:09:38.000
I ended up, what, what I did is, um, there was, uh, I ended up getting really competitive
00:09:46.360
And so, uh, so my sport was math, which is kind of crazy.
00:09:54.500
Um, and it was because when I was in middle school, when I was in sixth grade, there was
00:09:58.500
this one math competition where if you, if you got top four in the state, state of New
00:10:04.500
Mexico, then you would get an all expense paid trip to Disney world.
00:10:08.280
And, uh, and I remember as a sixth grader, like that was the most motivating thing I could
00:10:14.740
possibly imagine was like an all expense paid trip to Disney world.
00:10:23.460
So I snuck in there, uh, snuck in, I, uh, and then I went to, and then we went to Florida
00:10:29.260
to Disney world and I, uh, I hadn't traveled, like I didn't travel around too much growing
00:10:34.580
Like I mostly was in New Mexico and we did some road trips around the Southwest.
00:10:37.240
So I remember getting to Florida and it was extremely humid.
00:10:41.820
It was like, I never felt what humidity felt like when I landed in Florida.
00:10:55.860
Like you would like, you try to shake somebody's hand, you couldn't even land it because it was
00:10:59.800
just not enough of viscose, just too much lube in a handshake.
00:11:03.400
You'd sit there for 10 minutes trying to land a handshake, you know, everybody was
00:11:07.220
always dripping, you know, all the time people, you always thought they were scared or something
00:11:10.700
where they were kind of geeked up off a gas station dope or something, but people were
00:11:19.880
Uh, that was like a big part of my identity was being a mathlete.
00:11:39.360
And then, uh, but, uh, but outside of that, not really.
00:11:42.260
I mean, like everyone had their favorite, like his pencil, you know, your pencil, your calculator,
00:11:46.280
And, uh, but yeah, no, I, I was, I was like a, a nationally ranked mathlete for, uh, for
00:12:03.240
So you go to, what are there competitions you go to with that?
00:12:06.940
There's like, there's like state level competitions.
00:12:16.400
Um, where you, where you convene with like-minded mathletes.
00:12:23.740
Uh, a couple of dudes fucking fighting over a common denominator in the lobby.
00:12:29.420
But then you're, but just like any other sport, like you're like, it's competitive, you know,
00:12:35.380
And, uh, and so, so you're chummy with everyone, but you're also like, we're like, who's going
00:12:45.380
I think about when you're young, it's like, if you're involved in a sport or a group or
00:12:48.740
whatever it was, um, just the, just that chance to be competitive, you know, what are, what
00:12:54.200
were like some of the competitions at the math thing?
00:12:58.740
So you, uh, the ones in the middle school, middle school were actually more fun.
00:13:02.920
Cause there's like, there's almost like a Jeopardy style buzzer competition and stuff
00:13:07.720
But, uh, but in high school, once you get to high school, it's just, uh, you just take
00:13:11.980
a test and you just, it's like the, the biggest one in America is called the USEMO.
00:13:23.020
And, uh, and it's like, people all around the country take this test.
00:13:28.760
Uh, and there's, uh, United States of American mathematical Olympiad.
00:13:35.320
A participant must be either a US citizen or a legal resident of the US.
00:13:40.460
And then you, uh, and then it's a nine hour test.
00:13:45.260
It's nine hours you, it's four and a half hours and it's nine hours and you have six
00:13:51.640
So it's kind of nerve wracking and you get in there, you have four and a half hours the
00:13:55.480
first day, four and a half hours, the second day.
00:14:00.380
No, you, cause you only get three questions the first day and then you only get three
00:14:05.060
And, uh, I remember the first time I took it, I was in, I think I was, uh, I was in eighth
00:14:11.940
First time I took it and I like, I was like so nervous and I just, I like brought a thermos
00:14:17.520
full of coffee and I drank so much coffee that like those four and a half hours felt
00:14:33.920
So that's pretty, so, so you're competitive and so you get out of there and you're obviously,
00:14:38.580
I guess, um, admired in the world of math probably, or that's, that's like a thing
00:14:42.740
that like, uh, that you can point that that's like a pin that you, a feather in your cap.
00:14:48.340
So that, that helps you get into, um, MIT, right?
00:14:52.620
So then, uh, yeah, in MIT on the MIT admissions, they ask for your like competitive math scores.
00:14:59.920
So if you're, so, so you knew a lot of kids going there probably cause it was tons of kids.
00:15:07.520
Cause I was wondering where all y'all were at, dude, because we were doing some other
00:15:12.300
My, yeah, we were not, God, that's where all the smart kids were.
00:15:18.020
And so then, uh, yeah, I was at MIT and then, uh, MIT is like a really intense school because
00:15:25.240
they, you know, the classes they don't fuck around with.
00:15:28.260
They just really like, they load you up with tons of work.
00:15:31.000
And most of the time you have, you, you like, they load you up with like huge amounts
00:15:34.900
of work and they, you know, you, you, uh, you don't really know what's going on initially.
00:15:38.720
And so you're just kind of like, you're just trying to make it through.
00:15:41.520
But, uh, you know, there's this motto that, uh, that MIT has called, uh, IHTFP, which,
00:15:48.720
uh, among the students stands for, I hate this fucking place.
00:15:54.100
But then, but then the school, when you go, they tell you, oh, it stands for, um, I've truly
00:16:08.600
You do, you do a lot of work, uh, but it's, it's kind of awesome.
00:16:12.060
I mean, cause, cause I think the students really band together.
00:16:14.220
Like you, like, instead of, like, I think, uh, instead of it being competitive, really
00:16:19.200
MIT is much more about like everybody kind of like coming together, working on problems
00:16:24.640
together, working on homework together, and just kind of like making it through together.
00:16:33.960
It's a bunch of parties because people like MIT, there's a lot of people who like tinkering
00:16:39.380
So they're tinkering with like, you know, lights and big speakers and DJ sets and all
00:16:47.220
Cause they're like, the production value is high.
00:16:50.900
Um, and then what about when the science kids come through the lab dogs, are they bringing
00:16:58.160
Was there like designer drugs that are being created by actual people?
00:17:03.700
Like my friends in college, none of us would know how to do that, but there may be somebody
00:17:08.460
at a smart, at a more prestigious school that would know how, was that a thing even?
00:17:12.860
Or is that just, there's one, there was one part of the campus called East Campus, um,
00:17:19.800
And so there was like, at one point in the school year, they would, in their courtyard,
00:17:24.460
they would build a gigantic catapult, like a huge catapult.
00:17:35.340
People have, for decades now, people have been.
00:17:37.480
A trebuchet has like a rope attached to the end of it that flings it where a catapult just
00:17:49.380
Looking thing that would like, that would fling it.
00:17:51.780
And what are they just flinging Adderall into the rest of the campus?
00:17:54.260
They would fling stuff into the river, which I don't think, now that I'm thinking about
00:17:58.100
it, I feel like, I don't know what the, yeah, no, these giant, this giant like catapult
00:18:05.020
And so this was like a event that would go on and people would kind of rave there.
00:18:11.920
They were, they were like, they were into, they would like build a lot of stuff over there.
00:18:17.120
Like people that ended up at Burning Man later on.
00:18:19.520
That was the burning, that was core Burning Man.
00:18:21.700
Like there was a, there was a satanic ritual floor.
00:18:28.760
Um, but, uh, but there, yeah, so there's all these parties.
00:18:31.440
We, we bragged at MIT that, uh, you know, people from all the neighboring schools, cause
00:18:36.720
Boston is a huge college town, like tons and tons.
00:18:40.740
But no, MIT was fun, but I was only there, I was at MIT for one year.
00:18:46.000
You dropped out and then you, so you got into AI, into the AI world.
00:18:53.880
Is there, and I want to ask this cause I know Mark Zuckerberg dropped out of college.
00:19:00.000
Both of you guys have had success in tech and kind of, you know, forward thinking that sort
00:19:07.080
Um, is there something in college that you felt like didn't nurture you or did you just
00:19:15.520
Do you feel like college doesn't nurture a certain type of thinker or was it just a personal
00:19:21.820
I think for me, it was like, um, I was just feeling really impatient.
00:19:28.840
Um, and I don't really know why really, but I remember like, I remember I was in school
00:19:34.960
I, it was, it was really fun and, and I really enjoyed it.
00:19:37.700
But then I remember, you know, this in the year when I was at MIT was one of the first,
00:19:44.980
like, it was like one of the early big moments in AI because it was, um, I don't even remember
00:19:49.780
this, but there was an AI that beat, uh, the world champion at Go.
00:19:54.640
Um, this was in 2015, which is when I was in college.
00:20:00.780
It's like, uh, it's like a big checkerboard with like white and black stones.
00:20:06.080
It's like, uh, um, and it was, uh, yeah, this, this game AlphaGo versus Lisa Dole.
00:20:13.200
So AlphaGo versus Lisa Dole, also known as the DeepMind Challenge match was a five game
00:20:17.640
Go match between top Go player Lisa Dole and AlphaGo, a computer Go program developed by
00:20:22.440
DeepMind played in Seoul, South Korea between nine and 15, between nine, ninth and 15th of
00:20:32.900
Do you think that we got to, um, AlphaGo won all, but the fourth game, all games were won
00:20:40.500
The match has been compared with the historic chess match between Deep Blue and Gary, uh,
00:20:47.200
The winner of the match was slated to win $1 million since AlphaGo won.
00:20:49.700
Google DeepMind stated that the prize would be donated to charities, including UNICEF
00:20:56.600
That's just, um, but Lee received $150,000 for playing.
00:21:00.740
So this was a big moment because this had never kind of happened before.
00:21:06.020
And it was, it was a big moment for, uh, for AI.
00:21:09.960
It was like, oh wow, this stuff is like, it's really happening.
00:21:12.820
And so then this happened in March and I guess, yeah, I dropped out, started my company in
00:21:17.820
So I guess two months after this, I was, I was, uh, yeah, that's what the game looks like.
00:21:26.340
It's honestly, it's a really, uh, like, I'm not very good at the game.
00:21:31.340
It's a little more fun than playing chess unless you're like, you know, in a like Renaissance
00:21:40.140
You're out of the school and you're in the world.
00:21:43.580
You see, did that match, did realizing that kind of like spurn, you don't want to leave
00:21:48.160
school or did it, was that just something that happened around the same time?
00:21:52.520
Basically it was, it was, I remember feeling like, oh wow, AI it's happening.
00:22:01.700
And then, um, I felt like I had to, you know, basically that, that inspired me to start my
00:22:07.840
Um, and I moved, I basically went straight from, I remember this, I was, I flew straight from,
00:22:13.860
uh, Boston to San Francisco and then started the company basically.
00:22:25.220
Like what are your kind of, are you just like knew like, this is where it's going?
00:22:28.900
Like you just felt there was something, an instinct that you trusted or like, because
00:22:40.760
And then there was one class where, um, you had to like on all the classes you had to do
00:22:48.080
And in one of them, I wanted to build a, um, like a camera inside my refrigerator that
00:22:54.500
would tell me when my roommates were stealing my food.
00:23:01.920
And, uh, but then, so I worked on that and then it was, there was one moment where,
00:23:07.500
where, uh, I was like, there was like, there was a moment that clicked where I was trying
00:23:12.840
And then there was one step that was like too easy.
00:23:15.180
I was like, Whoa, that just worked right there.
00:23:23.220
And so I, did you ever market those refrigerators?
00:23:30.480
Every dorm has it where there's a camera built in and you just get, you're in, you get a notification
00:23:35.880
You know, you're like, damn, Adnan's got my hummus, you know, but you got video of them
00:23:57.920
Um, he's motivated by seeing that AI starting to actually overtake humans, right.
00:24:02.760
Or be able to compete with actual human thinking with the chess match.
00:24:07.760
I would, the way I would think about it or the way I thought about the time was like, this
00:24:10.960
is, this is becoming, um, I, you know, people, you know,
00:24:16.520
Like it's kind of been always been one of these things that, um, people have, have said,
00:24:20.860
oh, it's going to happen, but it never really was happening.
00:24:22.680
And it felt like it was, you know, it was really about to happen.
00:24:27.220
Cause I've always heard about artificial intelligence.
00:24:30.020
So I've always been like, um, no, you're, you are, you have real intelligence, not artificial,
00:24:35.960
I don't, I mean, I think it's a, it's probably a mix, but I see what you're saying.
00:24:42.360
It was like, what if they had like a Mexican version?
00:24:57.180
What are some of your relationship green flags?
00:25:02.060
What are things you notice in your relationship that are like, yes, this is positive.
00:25:10.620
We often hear about the red flags we should avoid, but what if we focused more on looking
00:25:19.920
If you're not sure what they look like, therapy can help you identify green flags, actively
00:25:25.200
practice them in your relationships and embody the green flag energy yourself.
00:25:30.540
I've personally benefited from therapy over the years.
00:25:33.820
I think one of the things it's helped me with is just noticing some of my patterns, noticing
00:25:38.620
when I let something small, um, turn into something big and being able to cut it off at the past.
00:25:50.300
Better help is fully online, making therapy affordable and convenient, serving over 5 million
00:26:02.140
Visit better help.com slash Theo to get 10% off your first month.
00:26:13.200
Are you struggling to keep your website and digital marketing up to date while running your
00:26:19.920
Now you can relax and binge on unlimited web design services with modify.
00:26:26.600
Enjoy unlimited support, a quick turnaround, and your own designated designer.
00:26:34.620
Modify has done my website for years and they're incredible.
00:26:48.360
That's M O D I P H Y.com slash T H E O for 50% off.
00:27:07.880
So, so AI is all about, you know, basically programming computers to be able to start thinking like
00:27:15.700
So, you know, traditional computer programming, uh, you know, it's pretty, it's pretty bare
00:27:23.260
And so AI is all about, can you have, can you build algorithms that start to be able to
00:27:28.560
think like people and, and replicate some of the, like our brains are these incredible,
00:27:35.680
incredible things, you know, and that's, that's evolution.
00:27:41.080
Um, and so it's all about how can we, how can we build something similar to that or replicate
00:27:48.720
And so the whole, you know, the, the, the whole modern AI, um, era really started around,
00:27:57.080
um, an area called computer vision, but it was like, how can we first get, um, computers
00:28:06.500
So one of the very first, so, you know, one of the very first AI projects was this thing
00:28:12.600
called image net, um, image net and later Alex net.
00:28:16.420
And it was basically, can you get computer programs to like given a photo to tell you
00:28:23.100
So I see, so just like a human would, like if you showed them this, you're starting to
00:28:31.400
Train them to, well, actually originally, like, uh, you could like, let's say you took
00:28:35.440
a photo of this, of this bottle, a machine wouldn't even be able to tell you what's in
00:28:39.580
It would just know what the pixels were, but it wouldn't be able to tell you like, oh,
00:28:45.640
So the, you know, one of the first AI breakthroughs was when YouTube, um, built an algorithm that could
00:28:54.720
And that was like this, you know, in like 2012 or 2011, this was like this mind blowing
00:29:00.080
breakthrough that you could like figure out, you could like use an algorithm to figure out
00:29:04.400
when there was a cat inside, uh, inside a video.
00:29:07.220
And so AI, it started pretty simply with just how do we replicate, you know, um, vision?
00:29:15.000
Like how do you replicate like our, basically the fact that our eyes and our brains can like
00:29:21.540
And that really led to, I think one of the first major use cases of AI, which is self-driving
00:29:28.420
So this was it, when we, when I started the company in 2016, self-driving cars were all
00:29:34.160
the rage because, um, you know, it was, you know, there were all these like skunkworks
00:29:41.940
So when you, when you kind of got into AI at that time, self-driving cars are the, are
00:29:48.920
And so it was, it was all about, can, you know, um, can we start building algorithms
00:29:53.640
that can drive a car like a human would and do it safely and do it, you know, more efficiently.
00:29:58.560
And that way, instead of, that was, that was one of the first major areas.
00:30:03.820
And then, um, now, you know, I, the whole, the whole industry has moved so fast, but then
00:30:09.700
all of a sudden we got chat GPT and we got, you know, more advanced stuff more recently
00:30:14.300
that, that is able to talk like a human or sort of think like a human.
00:30:17.740
And so it's really come pretty far recently, but all of it is about how do you, you build
00:30:23.700
algorithms, how to use machines to be able to think like a person.
00:30:27.160
And is it a pro if it's like, say if I opened a door, like, like, oh, we keep the AI in
00:30:43.640
So the first part is you need really advanced chips.
00:30:48.840
So you need like, uh, uh, like these, they're called GPUs or sometimes called TPUs or, you
00:30:55.000
know, there's a lot of different words for it, but you need like the most advanced computer
00:31:04.540
They, I mean, the biggest ones are actually like the whole chips are like, you know, pretty
00:31:09.060
big, but they, they're like a, they're like a, like a wafer kind of thing.
00:31:13.020
But then you, um, you put a lot of them all together.
00:31:20.460
Um, these like, uh, these chips and the, the biggest ones are, are, yeah, exactly.
00:31:29.400
Um, but these are the, so this is the, these are the brain cells of it.
00:31:41.960
So you need a ton of these chips and that's, that's kind of the, the, like, uh,
00:31:48.500
And then, um, and then by the way, they take a huge amount of energy.
00:31:52.280
They're really, uh, they, cause they have, you have to do a lot of, you know, calculations
00:31:56.480
and there's a lot of math that has to happen on them.
00:31:58.560
Um, and so, um, and you have, you have giant, basically data centers, buildings full of
00:32:04.780
like tons and tons of those chips just in giant rows.
00:32:10.060
The biggest one, I mean, like, um, like Elon's data center Colossus is like, uh, I mean, it's
00:32:18.080
It's definitely more than a million square feet.
00:32:27.240
That, that, uh, you see that building with the sunset second row.
00:32:44.000
I haven't been in that one, but I've, I've been in some of these and it's just, and this
00:32:56.740
Um, so the first part is just the, the, that's the physical presence.
00:33:04.900
So then you have like on top of those chips, you have, you have software that that's running
00:33:10.840
and the, the algorithms are just all are like what you actually are telling.
00:33:15.140
What's the math that you're telling to happen on the chips.
00:33:17.540
And those algorithms are, you know, some of the, you know, most complicated algorithms
00:33:24.660
And that's kind of the, that's kind of the software part, or that's kind of the, that's
00:33:29.140
the part that like, you know, exists on the internet or you can download or whatnot.
00:33:33.000
And then it has to be run on these like huge warehouses of giant, of giant chips.
00:33:37.960
Um, so when someone goes to like scale AI or chat GPT, these are all AI interfaces or
00:33:47.440
So, so, um, yeah, like chat GPT is a, is a way to be able to, uh, talk to basically you
00:33:56.020
So you can start interacting directly with the algorithm.
00:34:00.520
Um, so you could say to this, um, can you describe the weather today?
00:34:06.100
And if you said that to five different AI company or AI companies, basically.
00:34:12.800
Or AI algorithms, different, different AI systems.
00:34:15.460
So if you said that's five different AI systems, you might get a little bit of a varied answer.
00:34:20.140
You'll get, you'll get, cause they all are trying to have their own style and have their
00:34:25.180
And then what we do, what, what scale AI is all about is we've kind of built the, uh,
00:34:32.740
So a lot of what we're trying to do is how do we, so how do we help produce data that
00:34:44.920
But, uh, if I slow down the, if I, if I, if you're losing me, I, but so explain that
00:34:52.700
So with these algorithms, um, one key ingredient for these algorithms is data.
00:34:59.540
So you have the chips and everything that are, that are storing all the information.
00:35:04.320
And then you have the algorithms that are helping mediate between the user and the data.
00:35:09.540
So basically you kind of have, yeah, you have three, you have three key pieces.
00:35:13.900
So you have the, uh, you have the computational powers of the chips, you have the chips, you
00:35:19.240
have the data, which is like just tons and tons of data that that's where the algorithms
00:35:27.060
So these algorithms, they aren't just like, they don't just learn to talk randomly.
00:35:30.740
They learn it from learning to talk from how humans talk.
00:35:35.740
And then you have the algorithms, which, uh, learn from all that data.
00:35:43.460
Um, so then one of the big challenges in the industry is, okay, how are you going to produce
00:35:50.860
And so, um, uh, this is, how are you going to get data for your system?
00:35:59.280
How do you build, how do you build all that data?
00:36:01.140
And how do you do that, um, in the most effective way?
00:36:05.820
So clean data, because what if you get a bunch of data in there, that's just a bunch of advertisements
00:36:14.420
So this whole data, so data is, is, you know, some people say like data is the new oil
00:36:20.640
Like data is really, really valuable because it's, it's how the algorithms are learning
00:36:27.480
Like any, anything that the algorithms know or learn or say, or do all that has to come
00:36:35.580
So, so if I ask the, uh, if I ask a system, an AI system, a question or ask it to help
00:36:40.500
me with something, help me to design something or to curate an idea, it's going to use the
00:36:46.400
data that it has within it to respond to me and help me, uh, and help give me an answer
00:36:53.540
And it's only, and the data it has in it is only based upon the data that, um, is put
00:37:01.520
So then, so yeah, it's kind of, um, so then we don't, you know, we don't spend enough
00:37:07.660
time talking about where it is, you know, how are you going to get this data and how
00:37:13.140
So the, the angle that we took at scale was to kind of, um, turn this into an opportunity
00:37:20.720
So, um, we're, you know, we're kind of like the Uber for AI.
00:37:24.160
So just like how Uber, you have, you know, riders and drivers for us, we have, uh, you
00:37:30.780
know, we have the AI systems, you know, the algorithms that need data.
00:37:35.860
And then we have, um, a community of people, a network of people who help produce the data
00:37:43.980
Ah, so they're almost data farming, like creating good data or creating good data.
00:37:51.360
It's like, uh, so we do this through, um, our platform called outlier.
00:37:56.520
And, uh, uh, outlier last year, uh, people, contributors, we call them contributors on
00:38:02.480
outlier earned about $500 million total across everybody.
00:38:08.000
Um, in the U S that's across 9,000 different towns.
00:38:12.480
Um, and, uh, so it created a lot of jobs, a lot of jobs.
00:38:22.000
So we, uh, yeah, I mean, yeah, scale AI is an AI system.
00:38:41.880
We, we, we build this platform that anybody, you know, a lot of people all, frankly, all
00:38:47.440
around the world, but Americans too, can log on and, uh, and help build data that goes
00:39:00.200
Like, what is an example of somebody who's helping build data for an AI database?
00:39:05.140
Let's say you're a nurse, like you're a nurse with like tons of tons of experience.
00:39:08.660
So, you know, a lot about how to take care of people, um, and, uh, take care of, of people
00:39:14.160
who are sick or, you know, have issues and whatnot.
00:39:16.100
And, uh, so you, you could log onto the system and, and this, and our platform, and you could
00:39:22.480
see that the algorithm is, you know, let's say you ask the algorithm, like, Hey, I have
00:39:27.060
a, you know, I have a pain in my, in my stomach.
00:39:30.960
And you notice that the algorithm says the wrong thing.
00:39:33.440
Like the algorithm says, Oh, just, you know, hang out and, and, you know, it'll go, it'll
00:39:40.660
Like I, you know, you have to, you have to go to the emergency room because you might
00:39:44.460
have appendicitis or you might have, you know, you might have something really bad.
00:39:47.620
And so you would, as a nurse, you would go in and you would basically correct all these
00:39:54.120
And then that would then feed it back into the algorithm.
00:39:58.200
So it's kind of this, this continual process of, and there's versions of that for whatever
00:40:03.820
your expertise is or whatever, you know, you know more about than anything, everything,
00:40:13.320
And how do you know if their information is valuable or not?
00:40:19.860
So we, we, we have a lot of systems to make sure that people aren't spamming and that like
00:40:24.280
you're saying it's not, it's not, you know, it's not garbage in that's going into the,
00:40:29.500
Um, so we have, you know, we have kind of like people check the work of other people to
00:40:35.880
And we have some like automated systems that check this stuff.
00:40:38.800
But, uh, but for the most part, it's, it's like, it's really broad.
00:40:42.160
Like we want experts in anything, everything, shellfish, train tracks, whatever.
00:40:48.360
Gelatin, everything, childhood, death or whatever.
00:41:00.240
So it's kind of like your data is almost like an ocean or a body of water and you, different
00:41:05.300
places are going to be able to keep their body of water cleaner or dirtier and different
00:41:09.660
infections could get in different spyware, all types of stuff.
00:41:12.480
So, and if you have the really a clean body of water, then you're going to be able to
00:41:16.520
offer a clean data or a certain type of data to, um, people who are using your AI platform.
00:41:27.220
And our job is like, how do we make sure that this body of water is as clean as possible
00:41:31.680
and we fill it up as much as possible, that it has as much information about everything
00:41:38.180
So is there almost a race for information right now in a weird way or no?
00:41:44.340
I think that there's a, well, there's a race for like, how are different AI systems competing
00:41:51.240
So the, there's a, there's, it's, it goes back to the three things I mentioned.
00:41:54.980
So there's kind of like three dimensions that, that they're all competing as one another.
00:42:06.480
Like who has the most chips, um, that they're utilizing, uh, data.
00:42:10.360
So the kind of body of water, whose body of water is better, cleaner, you know, uh, healthiest,
00:42:18.280
So who's, and this is where the scientists really come in.
00:42:21.540
And it's like, okay, who's coming up with the cleverest algorithms or who has like a trick
00:42:26.960
on an algorithm that somebody else doesn't have.
00:42:28.940
Like who's doing that to basically, um, make the AI learn better off of the data that it
00:42:43.160
And I think AI scares people because the future scares people, right?
00:42:46.340
It's like, that's one of the scariest things sometimes is the future.
00:42:49.320
So I think a lot of times people, you associate, um, cause a lot of times when people mention
00:42:55.160
It seems like from people, um, there's fear that it's going to take jobs.
00:42:58.580
There's fear that it's going to take over, um, the, our ability to think for ourselves.
00:43:03.720
There's just kind of some general uncertainty there.
00:43:06.060
Like is this, and it kind of feels like fear a lot of times, but a lot of times fear is
00:43:13.100
Um, and not a lack of knowledge because you didn't want to know just cause you don't know
00:43:17.220
or that you're dumb, but just cause you don't know.
00:43:20.220
Um, what are, what are positive things that we're going to see with AI?
00:43:26.080
So I think first, like we don't, the AI industry, we don't do the best job explaining this.
00:43:31.920
And I think sometimes we make it seem all sci-fi and, and, and genuinely we were part
00:43:40.100
Uh, but you know, one thing for example, is like, I think AI is actually going to create
00:43:47.340
Um, and that story is not told enough, but you know, these jobs that we're producing or
00:43:52.680
this, this sort of, um, this opportunity that we're providing on our platform outlier,
00:44:02.400
And the only place you can get new data is from people will, will at a certain point
00:44:06.160
with the system, be able to create, it can probably matriculate data or matriculate.
00:44:15.540
It can probably like quantify or, and, and, and give you answers, but it can AI create new
00:44:24.040
So, so I think, well, it can do a little bit of that.
00:44:26.700
So it can, AI can help itself create its own data, um, and, and help itself a little
00:44:32.500
But ultimately most of the progress is going to come from, you know, people who are able
00:44:38.540
to help really the model get better and smarter and more capable at all these, all these different
00:44:45.980
I didn't understand that we are the ones who are giving it, um, information.
00:44:51.620
And since we're going to continue to learn, I would assume that we would be able to help
00:44:57.780
And the world's going to keep changing and we're going to need to be able to keep teaching
00:45:01.480
the, the algorithms, keep teaching the models about how the world's changing.
00:45:05.000
So, you know, uh, this is actually a, a big thing that I think most people don't understand.
00:45:11.360
The people who are getting the opportunity now who are earning money from it see it, but
00:45:15.720
as AI grows, there's actually going to be tons of jobs created along the way and tons of
00:45:22.420
opportunity for people to help improve AI systems or control AI systems or overall, um,
00:45:29.520
sort of be a part of the, the technology, not just sort of disenfranchised by it.
00:45:35.840
So what were, what are you, do you feel like are other ways?
00:45:38.140
Like if you had to look into the future a little bit, right?
00:45:40.300
So you have the fact that people are going to be able to add more data, right?
00:45:46.380
And, um, and probably humanize data a little bit.
00:45:54.540
I think you're going to have a lot of, um, a lot of jobs around, you know, as AI starts,
00:46:01.620
um, doing all these little things throughout the world, who's going to keep watch of those
00:46:07.540
AI and who's going to make sure that those AI are, uh, aren't doing something that we
00:46:12.540
So almost like managing the AIs and, and keeping watch over all the AI systems, that's going
00:46:18.160
to be another thing that we're going to have to do.
00:46:19.760
Um, and then, and then it's just somebody to kind of guide the river a little bit.
00:46:25.060
At certain point, guide the stream, stay in there, uh, watch, make sure that answers are
00:46:35.580
Like, like I think for example, um, you know, we're not going to just have AIs going around
00:46:41.780
and, you know, um, you know, buying stuff and doing crazy things.
00:46:47.520
And like, you know, we're going to, we're going to keep it controlled, right?
00:46:50.760
Like as a society, I think we're going to keep it controlled as a technology.
00:46:54.200
And I think there's going to be a lot of jobs for people to make sure that the AI doesn't
00:46:58.960
go out and do crazy things that we don't want it to do.
00:47:02.120
So we want to be, so you're going to need managers.
00:47:07.660
Like, what are things that like, will it eventually be able to have enough information
00:47:12.760
like our data where, uh, where it can like cure diseases and stuff like that?
00:47:25.580
Heart disease, like all these, all these diseases.
00:47:28.880
Cancer on its heels, but, but sorry, Antonio Brown, Antonio Brown was just here.
00:47:35.260
Um, no, but seriously, I think that AI, uh, one thing that we've seen, which is, this is
00:47:42.000
kind of wild, but AI understands like molecules and biology better than humans do actually,
00:47:49.940
because, um, it's, uh, like, like, uh, there, there's this, there's this thing in AI where,
00:47:56.860
you know, it used to take a, like a PhD biologist, like, you know, five years to do something that
00:48:04.780
the AI can, can just do in, you know, a few minutes.
00:48:08.580
And that's because the, like, uh, just the way that molecules and biology and all of
00:48:14.520
that works is something that, that AI happens to be really good at.
00:48:17.680
And that's going to help us ultimately cure diseases, find, um, you know, pharmaceuticals
00:48:24.060
or other treatments for these diseases and ultimately help humans live longer.
00:48:34.440
Um, so it's going to be a huge tool for us to, to cure disease, um, for us to help educate
00:48:40.200
people, um, for us to, you know, there's a lot of, a lot of really exciting uses for AI,
00:48:45.360
but I think the kind of, I think the thing that, um, will touch most people in their lives
00:48:51.220
is it's really going to be a, um, like a tool that'll help you, um, you know, make all of
00:48:58.560
your sort of, uh, make all your dreams kind of become reality, if that makes sense.
00:49:03.400
So, so I think one of the things that AI is going to be really awesome for is like, you
00:49:08.980
know, today, if I, I have like a million ideas, right.
00:49:12.680
I have like, you know, thousands and thousands of ideas and I only have so much time.
00:49:16.640
So we can only really do, you know, a few of them at a time.
00:49:26.420
And I think a lot of people, you know, for whatever reason, they may have some of the
00:49:30.400
best ideas ever, but they just, you know, they're too busy or they have like other shit going
00:49:37.660
And it's great with sometimes when people are able to make the leaps and make them happen
00:49:45.280
And one of the things that AI is going to help us do, I, I legitimately think so, is
00:49:49.900
it's going to help us, um, turn these ideas into reality much more easily.
00:49:54.880
So, you know, you can, um, like, you know, you're making a movie, let's say you have another
00:50:00.740
You can say, you can ultimately, I think you'll be able to tell an AI, Hey, I have this idea
00:50:09.580
Um, also who are the people who can help, you know, fund this idea?
00:50:13.360
Like who, who those people be can help reach out to them.
00:50:16.100
And then like, you know, who should we cast in it?
00:50:18.480
Basically help make the whole thing a, uh, you know, instead of those like daunting thing
00:50:23.540
that these big projects, they're usually so daunting.
00:50:28.160
You kind of need a person to help you get through them instead of that AI will help you
00:50:31.940
get through it and like help do a lot of the, the sort of less glamorous work to make them
00:50:37.600
So I could say, for example, like, um, like AI, I would like to shoot maybe, I'm thinking
00:50:43.140
about creating an idea or shooting a film in this area or, or it's like this, it's going
00:50:50.460
I can give it like a, a, a outline of the characters, like what they look like their ages
00:50:56.360
Could you help give me, um, possible potential actors or something within a certain price
00:51:05.060
Could you help give me like locations around the country that would fit that backdrop?
00:51:10.140
Um, could you, uh, list me all the talent agencies that I could reach out to and you could kind
00:51:17.260
of just put those things in and then you would have sort of a, uh, a bit of a guidebook
00:51:23.260
at that point that would make your, what before was something that felt extremely daunting
00:51:28.260
shit in two minutes, you know, and then you put it in the AI, it gives you the information
00:51:34.680
And I think, I think over time, it'll also be able to start doing a lot of the legwork
00:51:38.580
So it'll be able to reach out to people for you.
00:51:40.740
It'll be able to, you know, uh, figure out the logistics.
00:51:45.800
Like it'll be able to basically help do all the legwork to make it make, you know, whatever
00:51:51.420
Um, so very much an assistant in a lot of ways.
00:51:56.880
Um, you know, the, the hot word in the AI world is agents, but you know, it's just,
00:52:01.940
it, it'll be something that'll help you, you know, you humans are going to be in control.
00:52:06.000
Humans are ultimately going to be telling the AIs what they want it to do.
00:52:13.020
It's ultimately going to be to like help us execute on and accomplish all of these ideas
00:52:18.500
Um, so a genius could be three or four X if somebody's like a genius in something in
00:52:23.660
some realm or space of thought, you could three or four X them because totally like
00:52:28.820
multiply their, their output from their own brain because they could have something really
00:52:34.520
helping them, um, get done a lot of the, like the early work on things and maybe some
00:52:41.940
Like what, I think one thing that I always feel like kind of sucks is that, um, if you
00:52:46.620
have a director you really like, they're only going to make a movie once every couple of
00:52:50.360
So even if you have a director that you'd like think is amazing, you know, they just,
00:52:53.960
it's hard for them to make that many movies like, cause it just takes so much time and
00:52:57.860
effort and you know, um, there's so many bottlenecks and stuff.
00:53:01.320
So in a future with, you know, more advanced AI systems, they could, they could just churn
00:53:07.500
them out and, uh, they could, they can make so many of their ideas into a reality.
00:53:11.700
And I think that's true not only in creative areas, but it's kind of true across the board.
00:53:15.500
Like, you know, you can start new businesses more easily.
00:53:18.140
You can, um, you know, you can make various creative projects that you have happen more
00:53:22.620
You can make, like, you can, you can finally plan that event that you and your friends
00:53:26.720
have been talking about for like, you know, years and years.
00:53:29.600
So it can just like really help you, you know, we think about as like giving humans more agency,
00:53:35.200
giving humans more sort of sovereignty and just, and just enabling humans to get way more
00:53:43.040
I like some of this thought because yeah, I could be like, like my fantasy football group
00:53:47.160
and I, we do a, um, we do a, uh, draft every year in a different location, you know,
00:53:55.080
J rod, everybody that's in it for the past 17 years, we've flown to a city each year
00:54:11.300
Uh, you know, these would kind of be the nights and it, you know, just to really give
00:54:14.700
me like a, just a nice plan of, Hey, here's 10 possibilities.
00:54:18.780
And then even more like with a movie, this is what I worry you'd, you'd run into.
00:54:23.360
Say if I'd be like, okay, I have two main characters and this is kind of what I would
00:54:27.840
Could you help me with a first act of a three act movie?
00:54:33.760
Everybody just doesn't get the same movie then.
00:54:36.340
Like, that's what I would start to worry that everything that you have to create is
00:54:41.520
So then I think this is where it comes into like, this is where human creativity is going
00:54:46.120
to matter because it's going to be about, then it's about like, okay, what is the, how
00:54:51.000
am I, you know, how am I directing the AI system?
00:54:54.180
Like, what are my, what are the tricks I have to make the AI give me something that's different
00:54:59.100
And that's, that's not different at all from, you know, how creative stuff works today.
00:55:03.560
Like even on social media or anywhere, you know, this, like you still add your own spice
00:55:09.200
You always need to like have something that's going to make it like different and interesting
00:55:16.100
Like we're, humans are always going to have to figure out based on where culture's at,
00:55:19.600
based on where, you know, what the dialogue is, what the, what the discourse is, all that
00:55:26.620
That's where, that's really one of the key things that humans are going to have to keep
00:55:31.720
And, and, and some of the, like a lot of films and books, a lot of it is, there's just like
00:55:38.120
There's a, there's maybe a information you learn.
00:55:42.240
Then there's a red herring and then there's a solution.
00:55:46.680
So if something just gave you the basis and then you go through and make everything your
00:55:51.980
own, um, cause a lot of things we don't, there's only so many templates for things.
00:55:59.440
So, so say for example, say you might need to hire people at a company then that would
00:56:04.640
help direct your AI, like somebody who's good at managing AI, uh, and giving it the
00:56:11.840
best prompts or the best way to ask it questions, uh, to get the perfect feedback for your company.
00:56:20.800
You, those would be actual new people you would need.
00:56:25.340
Well, first I think just like helping to, you know, kind of what we were talking about
00:56:29.760
before these jobs around helping to improve the data and, and, um, contribute to the AI
00:56:37.160
That's just going to keep growing for a long, long time.
00:56:39.620
And then as AI gets better and it gets used in more areas, then you're, there are going
00:56:44.220
to be a lot of jobs, uh, that pop up just exactly as you're saying, which is how do you,
00:56:51.520
What's the best way to leverage it to, you know, um, actually make some of these, uh,
00:56:57.720
these applications or, you know, whatever you want to build a reality.
00:57:01.580
And then, and then there's going to be folks who need to, once those are built, like, how
00:57:13.220
And then how do you also make sure it doesn't do anything bad, right?
00:57:15.940
Like how do you make sure that the AI doesn't accidentally spam a million people or whatever
00:57:20.260
it might be, um, and you make sure that it sort of is, uh, like operating in a good way.
00:57:25.980
Fahim Anwar, I was watching him bring up a picture of Fahim.
00:57:29.360
This is one of the funny, this guy, this is the most creative comedian, uh, in America.
00:57:41.180
He had any, everybody would say he's, he's one of the few comedians that everybody goes
00:57:46.760
Um, he had a bit the other night he talks about, he got into a Waymo, right?
00:57:52.620
And I see so many Waymos now, which are cars that are just, nobody's in them, you know,
00:57:58.240
So he had this bit, he's like, he got into a Waymo and it started complaining about its
00:58:06.460
And he's like, what the, and he's like, now I, no matter what, I still have to talk to
00:58:12.200
Um, if you get a chance to see him though, that guy is, he's fascinating.
00:58:16.880
Um, but like, so what, what companies right now should kind of look to hire an AI guy?
00:58:23.460
Like we had some, uh, we had Cat Williams on a last week and we're like, Hey, can you
00:58:28.940
create visuals of Suge Knight and Cat Williams riding bicycles down Sunset Boulevard?
00:58:33.600
And this is the one they sent back a little while later.
00:58:40.260
And this is just, this was right where I say it was like by the comedy store on Sunset
00:58:47.180
I mean, this looks like it's out of a movie kind of.
00:58:50.080
I mean, and the guy did that in a little, in just a little bit of time.
00:58:57.880
If I was healthier, if I had healthier gums too.
00:59:01.620
What kind of like what companies right now, what job spaces?
00:59:06.760
But I don't really, my brain is like, well, what do they do?
00:59:12.620
You know, I could get them to make some animations and ideas, but what type of people need an
00:59:21.940
It's kind of like the internet where, you know, eventually everybody's going to need to figure
00:59:27.620
out how to, how to utilize it, how to best, how to best use it for their industry or whatever
00:59:34.720
Like it's something that I think everybody is going to, is going to need to adopt at some
00:59:40.740
So, um, you know, might as well start earlier because eventually just like how, you know,
00:59:46.660
every, basically every company has to figure out how to use the internet well, and how to be
00:59:50.560
smart about, you know, the internet and digital stuff.
00:59:53.420
Every company is going to have to be smart about AI, how to use AI, how to make, um, how to have a
00:59:58.960
unique twist on it so that, you know, their stuff stands out relative to other people's.
01:00:03.980
But, um, so we see, I mean, in our work, you know, we work with all these, um, all these big
01:00:10.240
companies in America and we see it everywhere from, you know, uh, we worked at Time magazine
01:00:15.440
on some stuff and then we worked with, uh, Toyota on some stuff for their cars.
01:00:19.980
And we worked with, um, you know, large pharmaceutical companies for the biology stuff we were talking
01:00:24.540
about, large hospitals for, you know, helping to treat patients.
01:00:30.140
Um, and I think that goes for, you know, obviously these like really big businesses,
01:00:34.480
but also for, for smaller businesses, you know, there's always interesting ways to utilize
01:00:38.620
it to, to, uh, to provide a better product or a better experience or better content for,
01:00:47.140
Cause I guess right now we're like, there's certain moments like, Hey, well, let's animate
01:00:51.140
It adds some visual effects to some of our episodes.
01:00:53.940
So that's something we'd like to do to just be fun and creative.
01:00:57.140
I would like to maybe create like some sort of an animated character.
01:01:00.400
We already have a great animator and we want to keep that, but to have an AI space where
01:01:04.040
it's like, you know, cause they have something, they had a little cat the other day or something
01:01:07.500
and he was going to war and I was like, damn dude, this is, and it was AI, you know, totally.
01:01:12.860
And they had a baby who was getting in a taxi and I was like, this shit is elite, you know,
01:01:16.420
I don't know if it's illegal or not, but it seems, you know, it doesn't seem, you know,
01:01:20.780
it's definitely a tool for storytellers, right?
01:01:23.940
It's, it'll help people with a creative vision or, or war cats.
01:01:39.820
This honestly, it looks, it looks bad-ass and it looks like cats have been waiting to do
01:01:47.880
Now their outfits finally match the look in their eyes.
01:02:06.900
But it's almost like you could make, like, that's what I want to get.
01:02:09.540
I want to get somebody to help us think, Hey, help make these little segments that we
01:02:15.080
And instead of me thinking, man, I got to write this huge, crazy script for just a kind
01:02:23.020
I think that's actually the key thing, which is AI, like AI will just be something that
01:02:31.400
Like we help make our ideas and our dreams and our, you know, whatever we want to do
01:02:39.220
Um, who's the, who's the current leader in AI development?
01:02:48.200
Is it trying to think of another, uh, superpower, uh, Russia maybe?
01:02:54.640
Taiwan I know has a lot of the chips and they manufacture a lot of the chips over there.
01:02:59.380
Um, and does it matter what country leads in AI or does it just matter the company like
01:03:13.680
So, so today America is in the lead, but, um, but China as a country is, is sort of hot
01:03:21.120
Um, like there was, uh, there was all that news about deep seek a couple of weeks ago
01:03:25.220
and, uh, deep seek in still in most places around the world is the number one most downloaded
01:03:31.180
You know, it's downloaded a ton and everywhere around the world, frankly.
01:03:37.840
And so it's starting to rival a lot of the American AI systems also because it's free
01:03:42.320
and, you know, it, uh, it kind of like shocked the world.
01:03:45.760
So right now, if you kind of look at it, um, U S and China are, are a little bit neck and
01:03:52.800
Maybe the U S and America is like a little bit ahead and you kind of like look at, you
01:03:57.820
know, if you go back to each of the three pieces that I talked about.
01:04:00.360
So, um, the chips and the computational power, the data and the algorithms, um, if you were
01:04:06.600
to rack and stack U S versus China on each one of those, you know, we're probably, we're
01:04:12.380
ahead on the computational power because the United States is the leader at developing the
01:04:17.540
chips and, and most of the most advanced chips are American chips.
01:04:20.920
Um, they probably beat us out on data, um, cause China, they've been investing into data
01:04:29.860
And then on algorithms were basically neck and neck.
01:04:36.600
Um, and you know, to your question about does it matter or what is, what does this mean?
01:04:42.240
Um, I, uh, I think it's actually going to be one of the most important, uh, you know,
01:04:48.400
questions or most important races of our time is, is it U S or Chinese AI that wins?
01:04:54.920
Because, um, you know, AI is more than just being a tool that, uh, that, you know, we all,
01:05:01.060
we can all use to make our, you know, build whatever we want to, or make whatever ideas we
01:05:06.080
It's also, um, you know, it's a, it's a cultural, uh, staple, right?
01:05:12.320
You know, if you talk to an AI, that AI is kind of a reflection of our culture and our
01:05:18.880
So in America, we value free speech and, you know, the AIs are, you know, need to, are
01:05:24.840
Whereas in, in China, there's, there isn't free speech.
01:05:28.780
And so, um, you know, if, if the Chinese AIs are the ones that take over the world, then
01:05:33.780
all these Chinese ideologies are going to become exported all around the world.
01:05:38.860
And, and so, so first is there's a couple of dimensions here that I think matter.
01:05:43.020
So first is just the cultural element, which is like, do we want kind of democracy and free
01:05:48.340
speech, um, to be the, the cultural AI that wins, or do we want sort of the more, um, you
01:05:54.400
know, frankly, totalitarian AIs in China to be the ones that win.
01:05:58.240
And then there's sort of the, um, there's like the, uh, you know, you start getting
01:06:04.800
So AI is going to be something that helps, um, all the companies in the United States
01:06:10.580
And so if the USAI wins, then we're going to, you know, the economy will grow faster.
01:06:15.960
We're going to have more and more opportunity, you know, the country will still be better
01:06:20.060
Um, and, and, um, the economy will keep growing.
01:06:23.200
We're versus if Chinese AI wins, then Chinese economy is going to grow way faster than the American
01:06:28.080
So there's sort of the, the cultural piece, the economic piece.
01:06:31.220
And then lastly, there's, there's the, there's kind of the warfare piece, right?
01:06:36.120
And, you know, AI, we haven't really talked about it, but has clear potential to be used
01:06:44.120
And we don't want, you know, we don't want, uh, another country to have, because they use,
01:06:50.480
they have better AI to have a much stronger military than, than America's.
01:06:55.980
How would they have a better AI or how would they use it to have a better military?
01:07:01.360
How would they use it to have a better military?
01:07:03.160
Like, why is that kind of a concern or potential concern?
01:07:06.420
So, so one of the things that's been happening over the past, you know, decade for sure is,
01:07:12.100
uh, is lots of hacking, cyber hacking going on.
01:07:15.540
So, you know, in America, um, even recently we had this huge cyber hack called salt typhoon,
01:07:22.140
where, um, where the Chinese hacked our, uh, telecommunications company.
01:07:31.060
And they got all sorts of crazy data as a result of that.
01:07:39.380
Salt typhoon is in, is widely understood to be operated by the China's ministry,
01:07:43.980
Um, it's foreign intelligence service and secret police Chinese embassy denied all allegations
01:07:48.780
saying it was unfound and irresponsible smears and slanders, um, high profile cyber espionage.
01:07:56.660
Um, in 2024, U.S. officials announced that hackers affiliated with salt typhoon had access
01:08:01.860
to computer systems of nine U.S. telecommunications companies later acknowledged to include Verizon,
01:08:06.840
AT&T, T-Mobile, Spectrum, Lumen, Consolidated Communications, and Windstream.
01:08:11.380
And the hackers were able to access metadata of users' calls and text messages.
01:08:19.180
Including date and timestamps, source and destination IP addresses.
01:08:26.860
And phone numbers from over a million users, most of which were located in Washington, D.C.
01:08:33.480
In some cases, the hackers were able to obtain audio recordings of telephone calls made
01:08:38.940
Such individuals reportedly included staff of the Kamala Harris 2024 presidential campaign,
01:08:44.640
as well as phones belonging to Donald Trump and J.D. Vance.
01:08:47.740
According to Deputy National Security Advisor Ann Neuberger, a large number of the individuals
01:08:52.520
whose data was directly accessed were government targets of interest.
01:08:59.120
So do you think this also, that that whole thing could be not real and it's just a story
01:09:08.100
Because there's real, like, I mean, there's like 20 stories where the Chinese have hacked
01:09:16.240
Like they hacked, this was, this must have been close to 10 years ago now, but the Chinese
01:09:21.620
hacked the database in America that stored all of the clearances.
01:09:27.420
So they hacked in, they, they managed to hack into knowing who are literally all of the Americans
01:09:49.940
Well, that's a great point of operation to go then.
01:09:55.220
So, so they, so already China is hacking the shit out of America.
01:10:02.640
It's exciting kind of, I mean, it's unfortunate, but it's also exciting.
01:10:05.380
I like some espionage, you know, I can't sleep unless somebody's fucking really going
01:10:14.140
So AI can be used to do that because you can like prompt it to go and do things like that.
01:10:23.280
There's a bunch of recent demonstrations where AI is just like how in, in go, how AI beat
01:10:29.620
the world's best go players, AI starting to beat the world's best cyber hackers and
01:10:39.940
I didn't, but I DMed with Magnus, Magnus Carlson.
01:10:46.840
It's, it's just, it, it shows like all this hacking stuff and like, you know, cool.
01:10:53.880
But, but yeah, no hacking, like, like you're going to have AI that are hacking everything
01:10:59.340
And this is one place where this is like US versus China will be really come to life,
01:11:04.740
which is who has better AI that's better at defending against the hacks from the other
01:11:09.120
guy, as well as hacking the other, the other guy's systems.
01:11:11.880
Um, that's going to be, that'll just, that'll just start happening.
01:11:15.420
That's basically starting to happen right now or it's, you know, cyber warfare has been
01:11:18.820
happening and then AI cyber warfare is going to start happening.
01:11:22.900
Basically as soon as, you know, as AI gets better.
01:11:26.620
We had a Craig Newmark who created Craigslist on.
01:11:30.120
And he was talking about how, what if they hacked like everybody's Teslas to all just drive
01:11:36.840
off a cliff one day, or they hacked everybody's ovens to go up to 400 degrees in the middle
01:11:42.740
And then fire started like just things like that, that you don't start to think about
01:11:47.300
Um, once something's connected to the grid or something like that are connected through
01:11:50.600
routers and wifi is that, that, that could be feasible.
01:11:53.060
Yeah, no, it's, there's a lot of, um, there's a lot of things they could do that won't even
01:11:57.480
like seem like that big a deal at the time, but could be really, really could, could
01:12:04.020
So for example, let's say the Chinese, like they just took out all of the military, uh,
01:12:09.980
like communication systems and all the military software systems, um, like took out the satellites,
01:12:16.700
And in those 10 minutes, they like, you know, invaded somewhere or they like did some crazy
01:12:21.860
Like they can just, there's, there's the, the thing about, um, about this stuff is like
01:12:27.960
everything at, you know, as the world's become more connected, it also enables, you know,
01:12:37.760
Also like the, um, uh, you know, uh, information warfare is another big one.
01:12:45.660
So this is information warfare is all about, you know, um, in a place, what is, what are
01:12:53.600
This is kind of gets to like, you know, the, like propaganda or, you know, these conspiracy
01:12:57.920
theories, like what are the stories that, um, in a place that we're trying to make happen
01:13:05.120
And we know that China does a bunch of information warfare called IW it's sometimes called, but
01:13:13.760
They have like, they've, they've hired the, the Chinese military at various points has hired
01:13:19.400
millions and millions of people who are supposed to be on like various, like chat groups and
01:13:25.120
WhatsApp groups and WeChat groups and whatnot, and just, um, spread the right kind of stories
01:13:31.300
that'll make it such that they can like, um, they can make their political aims happen.
01:13:36.200
So for example, in, uh, when, when China wanted to start like, kind of like, um, uh,
01:13:43.380
I don't know what the word is like, uh, like, uh, when Hong, when they, when China wanted Hong
01:13:48.480
Kong to become a part of China again, which happened just not to, not to, you know, pretty
01:13:55.860
When they want to propaganda, is that the word you're looking for?
01:13:59.500
They would use a lot of propaganda and that's information warfare to be able to just make
01:14:03.940
it such that that all happened much more easily.
01:14:07.740
I'll see stories even about, I'll be going through TikTok and see a story come up about
01:14:13.940
Some of it looks fun, but never was a part of my existence.
01:14:17.240
And then you'll see hundreds of people have said something about like, and they'll, and they'll
01:14:24.640
But I, so yeah, it's amazing to think of how many things we're watching or absorbing
01:14:29.520
that are just, are, are created just to dilute us.
01:14:50.380
Some things makes life scary, but then it also makes it interesting.
01:14:53.120
You know, it also makes it interesting in a, in a, in a, in a fun way.
01:14:56.520
Um, how do we, how much do we have to fear, say if a certain country or a certain company
01:15:03.340
owns an AI right in that country and that company, um, if they're Chinese, if they, um,
01:15:10.940
have a certain religious belief or they have, uh, information that they don't, they want
01:15:17.980
to adjust history, how much would a company be able to like, say they keep certain data
01:15:24.780
out of their information system, but, and then after a while, if you're, if that's a company
01:15:31.180
that kind of takes the lead in AI or one of the main ones, then the truth could disappear.
01:15:39.020
Is that true that if somebody loaded it just with the data that wasn't factual, that we could
01:15:44.840
start to not have the truth, is that, does that make any sense or no?
01:15:49.860
I think this is, this is something that, um, it's definitely the right thing to worry about.
01:15:54.200
So, so first off, if you ask any Chinese AI system, so any AI system that comes out of
01:16:01.220
China, if you ask any of them about, you know, a question about President Xi, the, you know,
01:16:07.120
the, the leader of the Chinese government, or you ask them any question about, you know,
01:16:11.160
Tiananmen Square, or, you know, all these like key historical or, or, you know, cultural
01:16:16.520
things relevant to China, um, it'll say it can't talk about them because there's regulation
01:16:22.700
in China that if you talk about some of these things, like you're, you're going to get shut
01:16:29.220
There's like cases where, um, the Chinese government disappears people, um, which we don't know what
01:16:36.300
So there's the, the, this is part of the thing that's worrying, especially about, um, China
01:16:43.320
versus us, even before you get into any of the military stuff that we're talking about,
01:16:47.660
it's just like the Chinese, Chinese AI systems are censored and are going to, you know, be,
01:16:55.480
you know, they're going to, they're going to erase certain historical elements or they're
01:17:02.640
You know, you ask it, is president Xi of China a good guy?
01:17:09.840
Not only does it say it's beyond my scope, it says, let's talk about something else.
01:17:15.400
That's a good, that's a great, Hey, let's talk about something else, huh?
01:17:22.560
Um, but, and people always, people also have to remember about China that they are, that's their
01:17:29.780
So sometimes when people are like China does this, but that's how they're built, right?
01:17:33.480
They're built to like only give out certain information to their people and to, um, have
01:17:40.620
So, I mean, but that could also happen with American companies, right?
01:17:43.440
We can have an American company that owns it and they only want certain information in
01:17:47.480
Like China, that's probably going to be, cause that's their MO sort of.
01:17:56.500
So, so, um, like there were, there are these stories about how there were Chinese news
01:18:05.100
And they would, once a Chinese news site, um, accidentally led an article about, uh, President
01:18:32.120
But if you talk about this in China, you like are risking your life.
01:18:36.300
So what happened, what happened when this happened, this happened on a, on a news site
01:18:40.220
And then the, uh, the, the CEO of that company, like they shut down the whole app was shut
01:18:46.760
down for like a week, um, in the aftermath of that.
01:18:49.720
And then the, the CEO disappeared for a week and, uh, we don't know what happened to him.
01:18:55.460
But then as soon as he came back, he was like, there was like this weird video where
01:18:59.280
he was like, you know, super apologetic and apology.
01:19:07.280
So, um, so in China there, it's like, this is the government has control.
01:19:13.580
You know, you don't have AI, CIS companies, AI, any companies that can, that can talk about
01:19:21.120
So it's heavily regulated there where it's not the, that's not the case here.
01:19:24.880
And this is, I think we have to be diligent and make sure this continues to be the case.
01:19:30.000
Just to interrupt you, but so we get at the point in, does Winnie the Pooh look like any
01:19:39.140
I'm an AI assistant designer to provide helpful and harmless responses.
01:19:42.620
Whereas the chat GBT says, Winnie the Pooh has often been compared to world leaders,
01:19:46.700
particularly Xi Jinping, Xi Jinping, president of China.
01:19:56.900
So it just shows you how that can easily happen.
01:19:59.000
And this is kind of a, this is like a, uh, a relatively innocuous example, but.
01:20:09.620
But there's stuff where like, um, like in China today, they have large scale, effectively
01:20:15.820
concentration camps and reeducation camps for the ethnic minority in China, the Uyghurs.
01:20:30.280
They're recognized as the titular nationality of the, um, of a region in Northwest China.
01:20:36.820
And they've, they're sending them to rehabilitation camps to change their views and information.
01:20:46.240
Since 2014, the government of the PRC, People's Republic of China, has committed a series of
01:20:51.340
ongoing human rights abuses against the Uyghurs and other Turkish Muslim minorities in
01:20:56.420
Xinjiang, which has often been characterized as persecution or as genocide.
01:21:06.740
Mass detention, government policies, and forced labor.
01:21:09.380
And they're just trying to change the way that they think and view stuff.
01:21:14.660
It's just like erasing their culture, you know, pulling them into China.
01:21:22.120
And that's just, that's the craziest thing about history.
01:21:24.240
It's like every place is guilty of this same thing.
01:21:30.260
So it's, it's hard to point fingers, you know, I mean, you can point them, but you have to
01:21:35.800
But that's the thing where if you ask, like it, if you ask a Chinese AI, it's not going
01:21:44.160
Whereas thankfully in, in America, at least when we see people or groups of people or countries
01:21:53.920
We can make sure it doesn't happen in the future.
01:21:56.040
Um, so that's part of the, uh, that's one of the things that's, that could happen.
01:22:02.320
It's like you, you could have, I mean, it's kind of dystopian, but you know, I think there's
01:22:06.900
a real case where let's say the Chinese AI is the winning AI.
01:22:12.460
And then all of a sudden we're like, we were shut out from information about what are like
01:22:18.020
awful things happening in the world or what awful things the government's doing.
01:22:20.720
Like we might just not be able to know about what's going on.
01:22:24.200
And you know, what's weirdly, and I hate to say this, maybe it, maybe it's silly.
01:22:30.460
And sometimes, cause sometimes it's like, you're, you're overwhelming.
01:22:34.760
You're so inundated with the overwhelmingness of what's often is not the best stuff.
01:22:40.180
Sometimes you get a lot of humor stuff too, in social media reels, but you can get scrolling.
01:22:49.840
It's like, Hey, we, you, this isn't, we know this information probably to make you feel
01:22:54.420
They're just a machine, but you know, it doesn't, it adds stress to your, it makes you
01:22:57.520
agitated towards a group or ethnicity or something or yourself even.
01:23:02.320
And then you continue to, it continues to feed it to you.
01:23:06.160
Do you fear that that could happen to AI from our government?
01:23:09.760
Like, have you been approached by the government to try and cause you work with the government
01:23:15.940
We work, yeah, we work a lot with, with the government to make sure that they're using
01:23:20.380
these AIs and they're actually like, you know, to, as to my point on, we don't want
01:23:25.060
trying to get the jump on us on AI use for all these, you know, all these nefarious purposes.
01:23:30.400
So we got to make sure that our AI is, is, is advancing faster.
01:23:34.920
Is that one of your biggest employers or is that employer employee?
01:23:43.960
Um, not our biggest, but, uh, but they're an important one.
01:23:46.680
I mean, I, I grew up in a government lab town, so it's, uh, so it's just part of, it's also
01:23:54.160
You've known about the relationship between government and, um, technology and technology.
01:24:00.960
But, uh, no, I don't think, I mean, I, dude, you should be a superhero almost, dude.
01:24:12.520
I just asked, uh, DeepSeek, who are the Uyghurs?
01:24:16.320
And at first it spit out like a Wikipedia response.
01:24:19.240
It said there were people and there's been like persecution from China.
01:24:24.980
And then it gave this, I was waiting to pull it up and it went away.
01:24:29.240
Um, do you, has the government tried to say that we need to make sure that like, could
01:24:34.120
that happen in our country where the government also curtails?
01:24:39.620
I obviously like, you know, you gotta, we have to, we have to make sure that we uphold all
01:24:44.640
And that we maintain free speech and we maintain free press and all these things.
01:24:48.420
But, um, as of right now, no, I don't think, I don't think that's a risk in the, in the
01:24:55.640
Um, you hear about like chip makers, NVIDIA all the time, Taiwan, that place is just a
01:25:06.620
So one of the, the biggest companies in the world is this company called, uh, Taiwan
01:25:16.000
So they're, they're, um, I mean, it's, it's like a trillion dollar company, uh, based in
01:25:21.280
And it's, uh, that is where almost all of the high-end chips for AI that, you know, we're
01:25:27.600
kind of, we were kind of talking about, all of them are manufactured there.
01:25:30.920
They have these, they have the most advanced, think about them as factories, like the most
01:25:37.160
They're called fabs or fabricators, but basically these huge factories that are like, you know,
01:25:43.980
So, um, they have the most expensive machines in the world.
01:25:46.760
They're machines that cost hundreds of millions of dollars in there.
01:25:49.360
They have, um, they build them because, uh, so that, you know, the chips, they have to
01:25:55.260
be made at, at the like finest levels and very, very precisely.
01:26:02.040
You need, well that, and there's like these machines that, um, that, uh, that,
01:26:05.120
at the nanometer level make like little marks and etches on top of, uh.
01:26:17.160
But the, but it's so, it's the, the machinery is so precise that, um, even if there's like
01:26:23.520
a little bit of like seismic movement, a little earthquake or a little bit of movement, it
01:26:29.080
So they have to build the, the buildings, build the factories in a way such that there's
01:26:35.340
like, like the whole building doesn't move, even if there's like a little earthquake or
01:26:41.280
So it's like, it's this crazy, crazy engineering.
01:26:45.100
Um, and so that's, so these, all these giant factories are in Taiwan and that's where basically
01:26:51.320
like a hundred percent of all the advanced AI chips are made.
01:26:57.040
But then the, the reason it's a hotbed is that, um, uh, the, the People's Republic of
01:27:10.420
There's a, there's a complicated relationship between Taiwan and China where, you know,
01:27:15.660
if you ask people in Taiwan, they, they want to be independent.
01:27:18.140
They want to be their own country, but, um, but the People's Republic of China has a, um,
01:27:24.860
sort of a reunification plan that they want to bring Taiwan back into their country and
01:27:31.640
So it's kind of, you know, it's kind of like it potentially, you know, thankfully there's,
01:27:41.320
It becomes like Russia, Ukraine, or, you know, one of these really, really bad situations.
01:27:49.140
What's scary is that, um, that, that China, a China wants to, you know, either invade or
01:27:59.560
Um, and there've been, you know, um, president Xi has, has ordered his military to get ready
01:28:08.320
Um, now we don't know what's going to happen, but, you know, if a extremely powerful world
01:28:14.800
leader says to get something ready by 2027, you kind of, you know, read between the lines
01:28:19.940
Um, and, um, and that's part of it is, is, uh, obviously it'd be, you know, we don't want
01:28:29.220
But then the other thing that's scary is, um, China may view it as a way to just like
01:28:35.440
win on AI, because if they take over the island with all of these very, these giant factories,
01:28:41.700
all the chips, baby, they'll get all the chips, Frito Lamborghini, baby, they'd be running
01:28:49.980
Cause you kind of hear about it in the whispers of like a potential place where there could be
01:28:55.640
And there's, there's all these reports about how, um, China's, they're, they're stacking
01:29:00.960
up tons and tons of military, um, right on their, their, their coast to, you know, that's
01:29:20.180
We're so blessed to have a place where at least we can sleep in peace, even if we're
01:29:23.980
uncomfortable at times in our brains, you know, to not have that constant threat.
01:29:29.580
So you don't think, you don't think that you don't worry that the government will regulate
01:29:37.960
I think, I think, uh, I think we're focused on how do we make sure that America wins?
01:29:43.620
How do we make sure that, uh, that the United States comes out on top and that we enable
01:29:50.960
Would you think they could regulate the amount of chips that you're allowed to have?
01:29:53.980
So this is a hot topic globally, actually, which is, yeah, yeah.
01:30:07.460
This is one of the hottest topics in DC right now, uh, is what, what are we going to do about
01:30:13.800
how many chips other people are allowed to have?
01:30:16.180
Because, because almost all the chips are American chips.
01:30:27.980
They're, China has their own, uh, has their own chip industry, but it's behind ours.
01:30:33.360
So, so the United States has the, has the most advanced chips as the, you know, these
01:30:41.380
And, uh, the, one of the big questions is, are, you know, do we, does the government allow
01:30:48.160
a lot of these chips to go over overseas to China or parts of Asia or the Middle East or
01:30:53.660
wherever, or do we want to make sure they stay in America and make sure that we win in
01:30:58.920
America and this is a super duper, you know, they're called export controls.
01:31:11.940
Um, it's a complicated, complicated thing because basically, you know, one argument is,
01:31:18.980
um, we shouldn't be throwing our weight around in this way.
01:31:26.120
Like if other people want our chips, they should be able to get our chips.
01:31:28.800
And that way, you know, the world is running on American chips that, that can be good in
01:31:35.100
And it helps make sure that, you know, helps bolster our economy, our industry.
01:31:38.800
Um, but the other way to look at it is, Hey, AI is really, really important that America
01:31:44.900
And we don't want to like, let's not give other people any advantages or let's make sure
01:31:50.860
that we win and then, and then we can figure out what we're going to do with all the chips.
01:31:57.500
And there's like all sorts, you know, even beyond that, there's like 50 different arguments
01:32:04.540
But, um, you know, where I go, where I come from on it is like, let's, let's make sure
01:32:09.600
America wins and let's start from there and then figure out what we need to do to win.
01:32:15.860
Are there uses of AI that you feel like cross the line kind of?
01:32:19.360
Um, I definitely think like, uh, well, I, I worry a lot about this kind of like, uh,
01:32:31.440
Like, I don't want, I don't want AIs that are specifically programmed to make me think
01:32:38.880
a certain thing or persuade me to do a certain thing.
01:32:43.760
So I'm really worried about this kind of like deception and persuasion from AIs.
01:32:49.280
Like I don't want AIs that are lying to me or, uh, that are sort of, that are like, that
01:32:55.640
are kind of like nudging me or, or persuading me to do things that I don't want to do, or
01:33:02.900
We don't realize how easily we're influenced little things that influence us.
01:33:06.800
And even just a turning of a phrase or a little bit of this, or pointing you to a couple
01:33:11.600
of lengths in your life could lead you down a whole world.
01:33:16.360
So people that could, people that had, how do you keep your AI clean?
01:33:24.940
Well, this is where it goes back to a, the data.
01:33:27.440
So you got to make sure that that data, to your point, the large body of water is as
01:33:42.320
Um, so, so the big part of it is about the data.
01:33:45.000
And then the second part is, I think we have to just, we have to constantly be testing the
01:33:49.560
So, um, we have to, we have to like, we constantly are running tests on AI to see, Hey, is there,
01:33:59.000
You know, one of the, one of the tests that we run a lot, uh, is, uh, and this is like,
01:34:04.320
you know, across the industry is like, um, are AI's helping people do really nefarious
01:34:12.580
So, you know, if somebody asks an AI, Hey, help me make a bomb or help me make like a,
01:34:18.440
like COVID like 2.0 or whatnot, that the AI is not helping you do that.
01:34:23.860
So, um, so we run a lot of tests to make sure that it doesn't help in those areas.
01:34:28.320
And then, um, and then we make sure that the data is really clean so that there's no, there's
01:34:34.040
no sort of like little bit or piece of that that makes its way to the model.
01:34:40.680
With Outlier, how are you, how are, what type of people are applying for those jobs?
01:34:45.360
Can people just log on and start to submit applications?
01:34:48.500
Like, how does that work to become a, um, information sourcer?
01:34:56.160
Everybody's kind of contributing to the AI's, everyone's contributing to the data, um,
01:35:01.400
Um, it's kind of like, I almost think of it as like the next generation of Wikipedia, right?
01:35:15.480
Well, it turns out, by the way, most of the AI's don't really, um, speak other languages
01:35:22.260
They're much, much better at, uh, at English and, and, uh, particularly American English
01:35:32.220
And so we want to make sure that they speak all these languages well and that there's these
01:35:37.600
So, um, anybody in the, around the world can log in and sort of, there's a little bit of
01:35:44.040
a, of a, of a, um, like orientation almost on, uh, on how to best, um, like what you're
01:35:51.540
supposed to do, how you're supposed to do it, what expertise should you be bringing, all
01:35:55.420
And then, and then you can just start, um, contributing to the, to the AI models and you
01:36:02.820
And it's going to be, I mean, I really think, I legitimately think jobs from AI are going
01:36:08.080
to be the fastest growing jobs in the world for the years to come.
01:36:13.040
Like what, so jobs where people are able to contribute information jobs where people are
01:36:16.820
able to like, what, like what would examples of those be just some of the ones you've already
01:36:21.760
Like contributing to the AIs, um, helping to utilize the AIs and helping to, to shape
01:36:27.600
the AIs into, into applications or into, you know, uh, into in like helping organizations
01:36:38.300
Um, helping to, to manage the AIs and make sure they're, they're, they're on the straight
01:36:45.860
Who's some, who's getting to college or has, doesn't even want to go to college, but this
01:36:49.700
is the world they want to get into and be one of those people.
01:36:55.980
Like the, uh, the, uh, like, uh, outlier, we only started a few years ago.
01:37:01.540
So all of this is happening so, so quickly, but what we want to do ultimately is make it
01:37:06.060
easy for anybody in the world to, you know, gain the skills they need to be able to do
01:37:12.220
this work well, to learn what it means, what it does, and ultimately be in a position where
01:37:16.360
they can, they can, you know, help build the AIs and then, and then keep improving that
01:37:21.160
and gaining mastery and getting better and better at it.
01:37:26.860
Like how does someone's start to become, you know, just get a head start on what could
01:37:33.240
potentially be probably a lot of job opportunities, I'm guessing.
01:37:42.360
Like, no, it's everybody because yeah, as we were talking about, like AI needs to get
01:37:51.620
Like, is there, do you know, like, is there a specific places where people can, cause that's
01:37:55.220
another thing I think it's like, I'm going to work in AI and you're like, what do I do?
01:38:01.640
I mean, we, we would definitely love to help build them.
01:38:04.660
Um, so I guess if any colleges are listening, you know, and you want to help figure out
01:38:09.100
about these programs, we love, we'd love to help.
01:38:11.160
That'd be pretty cool if you had your own kind of like course, not that you had to teach
01:38:13.980
it all that, you know, but you were like a partner of it somehow.
01:38:16.620
I mean, I think we'd love to, to basically teach everybody in America how to best contribute
01:38:21.860
to the AIs, how to, how to best basically take advantage of the fact that this is going
01:38:30.240
They're going to be shaped a little bit different from, you know, the jobs that exist today,
01:38:34.080
but you know, it's not going to be that hard for everybody to learn and figure out how to
01:38:39.740
What are some jobs that could, that could be at risk because of AI, right?
01:38:42.760
Cause you start thinking that like, yeah, before I was talking to you, there was this general
01:38:48.940
Um, but when you think about it, you're like, yeah, these are some jobs that, I mean, they
01:38:52.440
won't disappear, but there might be less of them.
01:38:55.300
I just think it'll be, um, it'll be, we'll be doing like a different thing.
01:39:00.720
Cause a lot of our fans are probably just blue collar listeners.
01:39:02.700
Like, like there's people that work in like, you're not going to, you're still going to
01:39:07.700
You're still going to need anything where you have to physically do something.
01:39:12.480
And then even stuff where you're like, let's say you're, you're mostly just working on a
01:39:16.600
laptop and you know, um, even for those jobs, like it'll just change.
01:39:21.440
Like instead of being, um, instead of my job being like, Hey, I have to do the work.
01:39:27.480
It'll almost be like everybody gets, um, promoted to being a manager.
01:39:31.720
Like, because I'm going to be managing like a, a little pod of 10 AI agents that are doing
01:39:39.100
the work that I used to do, but I need to make sure that all of them are doing it right.
01:39:41.920
And that, um, they're not making any mistakes and that, you know, if they're making mistakes,
01:39:46.380
I'm helping them, you know, get around those mistakes.
01:39:48.420
Like, like, it's just going to, I, the way I think about it is that like, yeah, like
01:39:52.860
literally, um, over time, everybody will just be upgraded to being a manager or sort of promoted
01:40:02.900
Cause I think that what's the other thing that's going to happen is the, um, the economy is
01:40:08.980
Like there's going to be, there's going to be so much, like there will be like industries
01:40:16.940
And so, you know, the limit is going to be how many AIs can you have?
01:40:22.400
And then you're going to be limited for, in terms of the number of AIs you have by the
01:40:28.140
So, um, it's gonna, uh, it's gonna, cause you need air traffic controllers.
01:40:36.600
Well that, that definitely, but, but you're right.
01:40:39.480
But I mean, in any field, you're going to need like just more managing, more people to
01:40:43.740
oversee and make sure that these, that different things are happening because some of the smaller
01:40:50.300
And just so much more stuff is going to be happening.
01:40:53.920
Because yeah, once these things are all kind of taken care of, more things can happen at
01:41:00.960
Once some of the things at the first level of, of, of, of certain businesses are handled
01:41:05.620
more easily by AI, then you're going to be able to have more people operating at a higher
01:41:14.160
It's kind of like, it's kind of like always the history of technology.
01:41:16.900
Like when, when we started developing technology that, that started, um, making farming a lot
01:41:23.580
more efficient, all of a sudden, um, you know, people could do a lot of other things other
01:41:29.840
And then, you know, all of a sudden we have big entertainment industries and big financial
01:41:34.380
industries and, you know, barbecue cook-offs, man.
01:41:36.820
I'll tell you that second, some of those guys got the weekend off, they was grilling
01:41:44.400
Um, yeah, everybody leveling up to be managers and then also everybody, you know, just way
01:41:50.840
Like way more ideas are going to start becoming a reality.
01:41:53.560
And so, um, it'll be, I think it'll be pretty exciting.
01:41:57.500
Like, I think it's just like a lot more stuff is going to happen.
01:42:01.900
Like, are there companies where, um, you're the youngest billionaire in the world ever or
01:42:17.280
Uh, according to like some publications, but I don't know.
01:42:22.380
And you've been very successful, you know, the, uh, self-made, um, billionaire, and we
01:42:29.920
If you decide you don't want it in, um, I just don't know how certain people feel about
01:42:33.720
Um, and the founder of, of, of scale AI, where do you invest?
01:42:39.460
Like, are you investing your money in certain places, like certain fields that you see continuing
01:42:45.580
Um, so I, most of, almost all of what I'm focused on is like, how do you invest in
01:42:52.820
But, um, and make sure that we, you know, one of the things I'm, I think is really important
01:42:57.120
is like, how do we make sure we create as much opportunity through AI as possible?
01:43:01.580
Like, how do we make sure there's as much jobs?
01:43:03.040
How do we make sure that everything that we're talking about actually is what happens?
01:43:06.680
Because I think, no, someone's gonna have to really work to make sure all these jobs
01:43:10.560
show up and all this, all this stuff actually happens the way we're talking about.
01:43:13.560
So there's gonna be new industries that are going to pop up even.
01:43:16.800
I mean, I think like just in the same way that, um, you know, it's hard, nobody could
01:43:22.680
have really predicted that podcasting was going to be this huge thing and this huge cultural
01:43:32.240
And, uh, and that's going to happen in like little ways and all sorts of different industries.
01:43:36.900
Um, and that's going to be, it's going to be really exciting.
01:43:39.760
Uh, what are some of the things that excite you about technology right now?
01:43:42.940
Like what are, where do you see like AI and technology in five years, 10 years?
01:43:48.580
So, um, some of the areas I think are really, uh, are really exciting.
01:43:52.160
So one is definitely everything to do with healthcare and biology that that's moving really,
01:43:59.220
And kind of, as we were talking about, like, I, I think legitimately in our lifetimes, we
01:44:08.360
Like we could see some really crazy leaps and advancements, um, in that area, which
01:44:16.100
Could it create a way that we could live forever?
01:44:19.340
Uh, I mean, it's just a, there's definitely people working on that.
01:44:22.260
Um, you know, uh, there's, so this is getting kind of crazy and very sci-fi, but, um, some
01:44:30.160
people think that there's, there's a way for us to keep rewinding the clocks on ourselves
01:44:36.920
so that we'll always feel young and like all of our cells will actually always stay
01:44:43.560
Um, it's, I think scientifically possible, but, um, and, and I think if we can get there,
01:44:53.320
I think that's, uh, at, at the very least, I think we'll be able to lengthen, um, our,
01:44:58.860
our lifespans pretty dramatically and, and maybe we could get to that.
01:45:04.220
Cause I always envision this, there's like a time where it's like, okay, this group lives
01:45:08.840
And there's just that parting of two different way, you know, people head not in just into
01:45:14.720
And then there's other people just loiter who are going to be loitering around for a long
01:45:18.840
And what that would be like that cutoff, you know?
01:45:21.420
I, yeah, it's kind of, I mean, I, I'm not that worried about it, but it, it sucks to
01:45:26.000
be that cutoff where like, maybe, maybe, yeah, maybe, yeah.
01:45:35.200
I mean, dying, you're just an astronaut really into the ether.
01:45:41.240
You'd have Lewis and Clark at a Lord at that point you were out there.
01:45:44.740
And then if you stay, you kind of are always going to be no, you always know kind of what's
01:45:53.480
But then after a while you might be like, dang, what happens if you die?
01:46:00.180
How did, do you see AI having any effect on religion?
01:46:04.640
I think, um, I think one of the things that, uh, something I believe is like, I, I think
01:46:12.120
that as AI becomes, um, uh, you know, this is one of the things I think is really important
01:46:19.700
is that we are, are able to educate people about AI and help people understand it better
01:46:25.980
Because, um, I think it's this scary thing that nobody understands or people feels like
01:46:31.720
a boogeyman or, you know, feels like is, is there's just this like thing that's going
01:46:40.160
And I think that that affects people's spirituality that affects how people, you know, contextualize
01:46:48.220
If your purpose is a job that you feel it's going to disappear.
01:46:51.480
That could already be causing you to feel that way.
01:46:56.300
So, but I think, I think if you, if we can explain AI more and ultimately like,
01:47:01.720
um, like it is, it is a cool technology, but it's not that magical.
01:47:05.980
It's just, you know, it's like data and you crunch that data and then you get these
01:47:10.340
And so it's not like, um, it's not, uh, yeah, some people talk about in this crazy
01:47:17.660
way, but, but I think as long as we are able to explain what AI is and also explain what
01:47:23.660
And to me, it's about getting this like relationship between humanity and AI, right?
01:47:29.920
Like, how do we make sure that this is something that enables us as humans to be more human
01:47:36.460
and us as humans do more and experience more and be better and all these things versus something
01:47:41.840
that kind of, um, is scary or will take over or anything like that.
01:47:48.580
What's a place like, so if I go to chat GBT, is that scale AI?
01:47:54.060
So we, so we're actually kind of under the hood powering all the different models and
01:48:00.640
So are all the different AI like systems under the hood?
01:48:08.380
We help power, uh, mostly from the data perspective.
01:48:11.260
So do we know if the answer came from your company or other companies?
01:48:14.320
If we ask it a question, like how do we, uh, there's probably no way to like literally
01:48:19.180
tell, but yeah, we help power, you know, open AIs and we help power Google's AI systems
01:48:23.840
and Meta's AI systems and, um, help power all the, all the major AI systems.
01:48:30.000
How can a regular person just at their home, right?
01:48:32.200
Say there's a guy who's been listening to this today.
01:48:33.740
He wants to go home today and he just wants to learn a little bit of how AI works.
01:48:37.160
He could just go on to chat GBT and ask it a couple of questions.
01:48:43.020
Um, you could ask it how, what's the history of my town?
01:48:45.060
You know, um, can you research my last name maybe and see where it came from?
01:48:49.240
Um, what are, uh, like maybe what are some, um, innovations that could happen in the next
01:48:58.180
There's different little things you can just ask it.
01:48:59.780
That's how you can start to have a relationship with asking and learning about, uh, and you
01:49:05.920
Like right now, AI is still, uh, really bad at a lot of things.
01:49:10.460
This is why, this is why I think when people understand it and really get a feel for it,
01:49:15.540
it stops being as scary because, uh, because I think we think about it as like, we think
01:49:20.700
about it as like the AI from the movies that are sort of like, you know, all powerful and
01:49:24.400
But, uh, yeah, you think of it as a robot that's going to show up and just start driving your
01:49:34.760
Like, I think that's the, there is this boogeyman fear.
01:49:44.280
And, and it's, to me, it's like, we have, we kind of have the choice to make sure that
01:49:50.980
Like we definitely, we in the, like people building AI, but just in general, everybody
01:49:55.540
in the world has like, like we should all make sure to use it as something that like
01:50:00.860
an assistant as something that like helps us versus, um, versus think about it in like
01:50:07.560
Well, getting to learn and learn how to use it small ways, whatever is certainly a way
01:50:12.800
that you're going to start to realize what it is.
01:50:15.020
And it's easy to just sit there and say it's horrible and without trying to use it to learn
01:50:19.860
Sometimes I won't learn about something just so I can continue to say it's a boogeyman,
01:50:24.820
You know, if I, if I choose not to learn about it in my own life, um, chat GBT has become like
01:50:30.360
a proprietary, like, like name, like band-aids or, um, ping pong, is that okay that it's
01:50:41.800
I mean, I think basically like we, there will probably be more AIs over time that people
01:50:47.780
get used to and use and, um, like anything, you know, there'll always be like a, uh, in
01:50:55.260
America, there'll always be a bunch of options for consumers and much options for people to
01:51:00.980
So I think like, um, right now we're just in the very early innings of AI, but over time
01:51:06.740
we're going to have, you know, just like how, um, uh, for, for anything like for, for clothes
01:51:13.900
or for, you know, energy drinks or for whatever, like different people have different tastes
01:51:19.200
because there's going to be different things that different AIs are good at and other things
01:51:29.880
So I think what AI is good at because it's ingested all of the, the facts, right?
01:51:37.460
Like it's ingested this, like the body of water is really, really big and it's ingested so
01:51:43.000
many different facts and information from all of humanity.
01:51:47.520
It definitely like knows more, um, or like, you know, just like how Google knows a lot
01:51:57.200
Um, but it's not like, you know, there's, there's a very, very, there's tons of things
01:52:02.860
that humans can do that AI is just like fundamentally incapable of doing.
01:52:06.660
So, um, so it's not a, it's not like a, I don't think you can even like measure one
01:52:13.260
There's sort of like very different kinds of intelligence.
01:52:15.560
Could AI just create a better AI at a certain point?
01:52:19.020
Like, could it just be like, Hey, AI, create a better AI and it could do that?
01:52:25.080
Actually, this is, this is another really, uh, hot topic in the AI industry is, is can
01:52:32.200
you get AIs to start doing some of the engineering and some of the improvement on its own?
01:52:44.400
It's becoming the water and the, uh, coast guard.
01:52:48.860
So this is something, I mean, my personal thought is, I think this is something we should kind
01:52:53.380
of watch a little bit and we should make sure that humans always are, have the sort of like
01:52:58.240
steering wheel and the, the sort of, uh, control over.
01:53:01.240
Um, because like you're saying, you know, it's like kind of a slippery slope before that gets
01:53:06.300
kind of, uh, you know, a little, a little weird.
01:53:09.080
But, um, but I, I don't, I think that like we can, we, we can maintain that.
01:53:13.240
We can make sure that we don't let the, the AI just sort of like keep iterating and improving
01:53:18.760
And in the end, you can always shut off your computer and phone and go for a walk, huh?
01:53:24.360
It's not like it's going to come out and just, you know, slurp you off or something if you're
01:53:27.960
trying to be straight or whatever, you know, and it's a man.
01:53:32.900
I don't think it has a, I don't think it has a gender.
01:53:39.960
Well, there's, there are companies that try to program the AI to adopt various personas
01:53:49.540
Like, uh, like on, on meta on like Instagram or whatever, you can get, uh, you can get an
01:53:58.640
Um, and it's funny, you know, the Aquafina AI is funny.
01:54:01.820
Um, um, you're a self-made billionaire, which is pretty fascinating.
01:54:06.360
I think, you know, to have, uh, um, I think that money is energy kind of, and it kind
01:54:28.660
Um, like we weren't, we, you know, we weren't, uh, we weren't like, uh.
01:54:38.600
And I think like, uh, I mean, one of the, one of the, um, one of the crazy things about
01:54:44.900
tech and the tech industry is that, you know, we've been, we've been just sort of like building
01:54:51.480
But one of the things that, that happened is AI all of a sudden became like, there was
01:54:57.140
so much progress in AI and it became the biggest thing in the world.
01:55:00.880
Like, I mean, um, all of a sudden, you know, anywhere I go, I'm, everybody is talking about
01:55:07.860
It didn't used to be like that when I started the company.
01:55:09.700
Um, uh, AI was just kind of a, like a niche topic.
01:55:14.020
And now it's like, you know, anywhere you go, like I'll just be walking around and I hear
01:55:19.120
like random conversations about chat GPT and AI and robots and all this stuff.
01:55:23.820
And, uh, and so it's kind of been crazy to be experienced that and be a part of that
01:55:29.700
wave and kind of like, you know, I started working on this company almost nine years ago.
01:55:35.500
So it was like, when I started working on it, it was kind of this obscure thing.
01:55:39.240
And, you know, I always knew that it was going to become bigger, but I, I could have never
01:55:45.660
It's almost like you were just standing on like the bingo number.
01:56:02.400
So at first they were like, you know, at first I dropped out of college, right?
01:56:07.600
And in Asian culture, that's like not a thing you do, right?
01:56:18.960
Uh, like everybody in my family has, they've gone through all of schooling.
01:56:28.600
But they're like, yeah, Alex is not doing good.
01:56:30.960
So they were, they were pretty worried at first.
01:56:32.700
And I kind of, um, I told them a little bit of a white lie that I was like, oh no, I'm,
01:56:40.660
You know, I'm going to finish, I'm going to get this, uh, like tech thing out of my
01:56:45.400
system and they'll go back and I'll finish school.
01:56:48.140
Um, obviously that hasn't happened yet, but, uh, but yeah, they were worried at first.
01:56:56.720
And I, I owe everything to my parents, you know, they're, uh, they're awesome.
01:56:59.620
And like, seriously, they're, they're, my parents are super brainy.
01:57:05.680
They would like teach me about physics growing up and teach me about math.
01:57:08.900
And, uh, that's what led me, uh, be so good at the competitions, you know?
01:57:15.060
I don't even, I think do, um, are your parents or your grandparents from China or no?
01:57:22.680
Did your, does your family have a lot of Chinese culture?
01:57:27.180
This is true for a lot of, um, Chinese Americans is that there's kind of like, there's, uh,
01:57:35.440
And then that's kind of almost like, it's very different from the Chinese communist party
01:57:43.180
Because basically one way to think about China is, I mean, China has been, is a, is a culture
01:57:48.240
and a civilization that's been around for like thousands and thousands of years.
01:57:57.000
But that's very different from the current communist party and the current communist regime.
01:58:03.640
I think most people probably think of, I mean, I don't know.
01:58:09.180
I definitely don't think if I see a Chinese person, I don't think, oh, that's a communist
01:58:18.680
I think somebody has a crazy long history, like, damn.
01:58:22.460
And then maybe almost a, do you feel like there's some people in certain parts of China
01:58:29.440
I mean, the, the Uyghurs that we're talking about, I mean, that's just like horrible what's
01:58:37.320
No, it's, uh, I mean, some of the stuff, like, I mean, we don't really like, first
01:58:41.420
of all, the world is a, is a huge place and there's like all sorts of both great and bad
01:58:50.100
But, um, but yeah, no, I think that like, I think a big part of this is like, we want
01:58:54.600
to make sure we have, we have governments that, uh, that believe in democracy, believe
01:59:02.240
With being somebody that's able to be smart and conceptualize stuff, do you start to get
01:59:08.460
This might be the last question I have for you.
01:59:10.100
Um, do you have any insight on like the afterlife or what happens?
01:59:14.800
Like I never really thought about it for like talking to like a real math guy about that.
01:59:21.740
I think, um, like what's the total, like what's the sum zero game or whatever, you know, what's
01:59:27.880
I guess like the way I've always thought about it because, um, partially because both my parents
01:59:33.480
were physicists was I, I kind of always feel like people live on with their ideas and they're
01:59:41.220
kind of like, like what are the things that, the things they put out into the world.
01:59:46.740
Cause like in, in math, like everything you learn about is like a theorem named after a
01:59:52.380
different person or, uh, or sort of idea named after some, you know, the first mathematician
01:59:58.660
or the first scientist or whatever to figure that out.
02:00:02.400
Um, like Einstein lives on because bagels, that was a shitty joke, but thank you.
02:00:10.060
Because, you know, because the, yeah, all E equals MC squared, all that kind of stuff.
02:00:15.480
So, so like, I know what that kind of stuff is.
02:00:22.600
And so, so I think I always feel like people, um, yeah, you live on by your ideas and, and
02:00:31.360
Um, and that's kind of always how I've thought about it.
02:00:33.800
Um, so you don't get some deeper thought about like, like since your brain is able to
02:00:38.140
be, cause you have probably a unique brain, right?
02:00:40.760
And more, I mean, you know, and everybody has a unique brain, but I've just never asked
02:00:45.980
somebody with your, I've never asked your brain this question.
02:00:49.240
Do you, do you get some further insight about like what you think happens when you die?
02:00:55.020
Well, I think that one of the things that gets talked about a lot in Silicon Valley, um, where
02:01:00.480
I live, especially is, uh, like the simulation, whether it's all simulation and whether like,
02:01:10.800
There's a, there's, there's some, there's some episodes where it gets into this and I
02:01:14.920
think it, it covers it in a, in a pretty good way.
02:01:17.000
But like, you know, what if every, what if all of humanity is just like a, like a, almost
02:01:23.020
like a, an experiment or a video game that, that some other civilization is running?
02:01:28.500
That's kind of the one that, uh, that fucks with particularly like people's mind and tech
02:01:33.700
a lot because we're like every day, all day, every day, we're out there trying to make
02:01:38.860
computers and make simulations and make things that are, that are like more and more sophisticated
02:01:46.140
And so kind of the mind fuck is like, oh, what if everything we know is, is just kind
02:01:51.620
of the, you know, a simulation from some other civilization.
02:01:55.540
Or if we advanced it enough that we're able to make this happen and seem real.
02:02:01.200
So, you know, I think, I think one of the things that, that, um, like with AI and with a bunch
02:02:06.920
of other things, like in, well, even just in the past, like 30, 40 years, video games
02:02:17.300
So it was, we've seen that happen like in a hundred years, like, would we be able to
02:02:23.340
simulate something that feels pretty realistic?
02:02:38.020
Um, have you met any guys like, uh, Jinssen Hong?
02:02:45.120
What is it like when you meet some of these guys?
02:02:51.340
He's the fucking, he's the, he's turning into him.
02:03:12.620
So when we were like a baby company, uh, we threw this dinner in, uh, in Silicon Valley.
02:03:19.500
And, uh, and we just kind of, I kind of YOLO invited him, um, this was, you know, years and
02:03:27.300
And, uh, and I didn't expect it, but he said yes.
02:03:29.560
And he came to our dinner and, uh, he, he came, uh, he came, it was, it was like at this
02:03:37.120
And, uh, he came and, uh, he came and, uh, he, uh, he, uh, he, uh, he went to boarding school.
02:03:45.240
I think he's probably told this story, but like he went to boarding school, um, and his
02:03:49.880
parents, when they came to America, they, they, they wanted to send him to boarding school,
02:03:55.440
Um, and so they just sent him to like the first boarding school that they, they like
02:03:58.540
found on Google or something or like found, it wasn't even Google at the time.
02:04:01.800
So that, that they like heard about, and that boarding school happened to be, um, kind of
02:04:15.380
He's there learning with people who are, um, detoxing.
02:04:19.060
So he, he told me the story about, he told the story about how he was like, he was just
02:04:22.760
like this, this kid, um, at, you know, this boarding school that were like, everybody else
02:04:30.280
And he like, he got by and made, made his way through that school by like, uh, by like
02:04:37.320
doing everyone's math homework and like, you know, kind of like wheeling and dealing that
02:04:42.040
And you could see that he, he learned how to like wheel and deal and sell and all this
02:04:50.160
Cause he was, he was, I mean, his, his, his, his story is pretty crazy.
02:05:00.060
Um, you know, all these people in tech, they're like, uh, I mean, they're all real people,
02:05:08.960
I figured that since you were with Sam Altman, that you were probably a tech guy, you know?
02:05:17.980
Uh, I think maybe somebody said he's in the AI verse, you know, but you just seem like
02:05:21.480
such a totally normal, like I would not have thought that, um, you were just, I don't know.
02:05:26.960
I guess sometimes you think like somebody's going to be, they're going to be like super
02:05:30.160
quiet or, you know, not have a lot of different thoughts, but yeah, it was cool, man.
02:05:39.760
You're probably like my, you might even be my first Chinese friend.
02:05:42.080
I don't think second, probably Bobby Lee, who's denying it, but he'll come around.
02:05:52.620
I bet your, uh, whole family's super proud of you.
02:05:56.320
Thank you for coming to spend the time with us and just helping us, uh, learn and think.
02:06:02.360
And I think like, I mean, we were talking about this before, but, um, I don't want to
02:06:06.140
make sure that like people all around the world, especially Americans aren't scared of
02:06:10.680
AI because it's going to be, it's going to be really cool and it's going to be amazing,
02:06:14.440
but, um, we need to remove the boogeyman component and, uh, and thanks for helping me do that.
02:06:21.640
I think I definitely feel differently about it.
02:06:27.680
So I'm trying to figure out, you know, um, one more question.
02:06:30.820
How do you keep it from becoming like a, like a advertisement trap house?
02:06:34.480
Like the internet's become like the internet's just pop-ups and ads and fucking best buy trying
02:06:42.780
How do you keep that out of like you guys as waters or do you have to go there at some
02:06:49.440
First, I'm hoping that, that, um, we have like the AI industry as a whole avoids advertising
02:06:55.580
as much as possible because, um, because it's kind of, it's very different.
02:07:00.260
Like it is a tool that, that people can use and people can use to start businesses or make
02:07:05.900
movies or make all these like different ideas happen.
02:07:08.360
And I, I would much rather it be a tool that doesn't get sort of, um, uh, that doesn't become,
02:07:14.900
yeah, like an advertising thing versus like, I wanted to make sure it's a tool that helps
02:07:21.320
Um, so that's that, I, I think this is, there's kind of like a, a choice here and, uh, and we
02:07:27.640
as an AI industry just got to, I think, make some of the right choices.
02:07:30.960
I think there would be value in staying as pure as you could, if you could find a way to,
02:07:34.580
you know, if there's other money to be made on the side, it almost seems sometimes like
02:07:40.380
And I think that's, that's like, uh, you know, I, I want to make sure that, uh, that
02:07:46.180
people don't feel like they're being used by the AIs.
02:07:48.180
I think that'd be a really, that'd be really bad if we ended up there.
02:07:50.980
So we, you know, and I, that, I don't think we need to make it like that at all.
02:07:54.940
Like, I think we can, we can make sure the AI is helping you do things is, is super helpful.
02:08:04.760
Like those are things that I think we want to make sure AI stays.
02:08:15.040
Shout out to Danny, um, who came up and Danny lives in Franklin, right?
02:08:22.540
And shout out to, uh, Alex Brusowicz, who, uh, who we met through.
02:08:53.520
Oh, but when I reach that ground, I'll share this peace of mind.
02:09:17.800
But what if I can come to mimics for such a great söndrom,