This Past Weekend with Theo Von - February 18, 2025


E563 AI CEO Alexandr Wang


Episode Stats

Length

2 hours and 9 minutes

Words per Minute

199.33678

Word Count

25,808

Sentence Count

1,920

Misogynist Sentences

10

Hate Speech Sentences

47


Summary

In this episode, I chat with entrepreneur Alexander Wang about his trip to the inaugural, the future of AI, and the importance of social media in the world of business and technology. We also talk about what it's like to be a self-made billionaire at the age of 24.


Transcript

00:00:00.000 We hope you're enjoying your Air Canada flight.
00:00:02.320 Rocky's vacation, here we come.
00:00:05.060 Whoa, is this economy?
00:00:07.180 Free beer, wine, and snacks.
00:00:09.620 Sweet!
00:00:10.720 Fast-free Wi-Fi means I can make dinner reservations before we land.
00:00:14.760 And with live TV, I'm not missing the game.
00:00:17.800 It's kind of like, I'm already on vacation.
00:00:20.980 Nice!
00:00:22.240 On behalf of Air Canada, nice travels.
00:00:25.260 Wi-Fi available to Airplane members on Equipped Flight.
00:00:27.200 Sponsored by Bell. Conditions apply.
00:00:28.720 CRCanada.com.
00:00:30.000 I want to let you know we've restocked all your favorites on the merch site.
00:00:35.200 Everything is in stock.
00:00:37.320 You can show up theovanestore.com.
00:00:41.780 And thank you so much for the support.
00:00:44.100 I have some new tour dates to tell you about.
00:00:45.880 I'll be in Chicago, Illinois on April 24th at the Wintrust Arena.
00:00:51.540 Fort Wayne, Indiana on April 26th at the Allen County War Memorial Coliseum.
00:00:57.260 And Miami, Florida on May 10th at the Kaseya Center.
00:01:01.960 Get your tickets early starting Wednesday, February 19th at 10 a.m.
00:01:07.080 with pre-sale code RATKING.
00:01:10.360 We also have tickets remaining in East Lansing, Victoria, B.C. in the Canada.
00:01:16.280 College Station, Belton, Texas.
00:01:19.140 Oxford, Mississippi.
00:01:21.700 Tuscaloosa, Alabama.
00:01:23.760 Nashville, Tennessee.
00:01:24.860 Winnipeg in the Canada.
00:01:26.680 All tickets at theovan.com slash T-O-U-R.
00:01:33.600 Today's guest is from Los Alamos, New Mexico.
00:01:36.540 He's a leader in the world of business and technology.
00:01:40.200 He's an entrepreneur.
00:01:41.880 He started Scale AI, a company recently valued at $14 billion.
00:01:46.880 He started it when he was only 19, and at 24, he became the youngest self-made billionaire
00:01:54.520 in the world.
00:01:56.240 We talk about his company, the future of AI, and the role it plays in our human existence.
00:02:01.740 This was super educational for me.
00:02:03.900 I hope it is for you.
00:02:05.040 I'm grateful for his time.
00:02:07.420 Today's guest is Mr. Alexander Wang.
00:02:10.020 Alexander Wang, man.
00:02:28.180 Thanks for hanging out, bro.
00:02:29.480 Yeah, thanks for having me.
00:02:30.460 Yeah, good to see you, dude.
00:02:31.820 Last time was at the inauguration.
00:02:33.860 Yeah, what'd you think of that?
00:02:34.820 Like, what were your thoughts after you left?
00:02:36.440 Um, because you and I ended up, we like left out of there and then went and got lunch together,
00:02:40.540 which was kind of crazy.
00:02:41.980 It was, uh, there was a lot going on.
00:02:43.480 Yeah, it was a, were you there the whole weekend?
00:02:45.780 No, I just got there, uh, the day, the morning of the inauguration.
00:02:49.820 Were you there the whole weekend?
00:02:50.900 I was there the whole weekend.
00:02:51.880 Yeah, no, I mean, I think at least for what we do, like for AI, the, uh, the new administration
00:02:56.900 is really excited about it and wants to make sure that we, we get it right as a country.
00:03:00.580 So, um, that was all great, but it was kind of a crazy, the whole like event, like everything
00:03:05.940 was pretty crazy.
00:03:06.820 I don't know.
00:03:07.020 What'd you think?
00:03:07.880 I mean, when I saw Conor McGregor show up, that's when I was like, shit is, where are
00:03:12.780 we?
00:03:13.460 It felt, that's what like the most bizarre, I mean, you were there, Sam Altman was there.
00:03:18.620 I was just like, it was like, whoa, what happened here?
00:03:22.540 Um, and part of that's because Alex Bruschewitz, you know?
00:03:25.180 Totally.
00:03:25.620 Yeah.
00:03:25.920 I mean, it's all because of him that we all went, but just the fact that he would bring
00:03:29.060 these people together, it kind of makes you question crossover.
00:03:31.840 It's like, uh, it's like in, in those like TV shows when you have those crossover episodes.
00:03:36.080 Yeah.
00:03:36.400 That's what kind of what it felt like.
00:03:37.080 Oh yeah.
00:03:37.580 When like the Harlem Globetrotters and it would be like them versus like, um, the Care Bears
00:03:42.480 or whatever would show up.
00:03:44.500 Exactly.
00:03:44.940 Yeah.
00:03:45.160 Like this.
00:03:45.520 Yeah.
00:03:45.680 This seems, um, yeah.
00:03:47.920 Some cross pollination.
00:03:49.940 Um, yeah.
00:03:50.620 What'd you think when Conor showed up?
00:03:51.860 Was that strange to see him?
00:03:52.940 Like, was there somebody you thought that was strange, like unique to see there for you?
00:03:55.780 Uh, well, yeah.
00:03:57.760 I mean, you, Conor, like the, the, the Pauls, I don't know.
00:04:01.520 The whole, the whole thing was pretty, uh, pretty crazy.
00:04:03.640 I see Sam all the time.
00:04:04.600 And I see, uh, I see some of the tech people all the time.
00:04:07.120 I mean, it was funny crossover and it was obviously like the, so many people flew in to be able
00:04:12.040 to go to like the outdoor inauguration.
00:04:14.900 Right.
00:04:15.200 Yeah.
00:04:15.480 And so, I mean, there were so many people in the city.
00:04:17.720 We ran into them obviously, but like there were so many people in DC, uh, who were just around
00:04:22.460 for the inauguration.
00:04:23.240 So it was, uh.
00:04:24.280 Yeah.
00:04:24.440 I didn't even think about that.
00:04:25.140 Oh, there was a couple hundred thousand people probably that just got kind of displaced in
00:04:27.900 a way.
00:04:28.320 Yeah.
00:04:28.680 And suddenly there's in bars or whatever, just like buying, like there was like people
00:04:33.480 I saw walking just adult onesies and shit that said Trump or on the, that's crazy shit.
00:04:39.760 Yeah.
00:04:40.500 Um, yeah.
00:04:41.720 Thanks for coming in, man.
00:04:42.640 I'm excited to learn about AI.
00:04:44.080 I know that that's a world that you're in.
00:04:45.540 You're in the tech universe.
00:04:47.040 Um, and you, you're from New Mexico, right?
00:04:49.540 Yeah.
00:04:49.800 And so when you grew up there, did you have like a strong sense of like science and technology?
00:04:55.600 Was that like part of your world?
00:04:56.880 Like, did your parents like lead you into that?
00:04:59.140 Like, what was just some of your youth like there?
00:05:01.560 Yeah.
00:05:01.760 So, uh, both my parents are physicists and there's a, I grew up in this town called Los
00:05:07.280 Alamos where there's a, there's a, there's a national lab there.
00:05:11.360 Um, did you watch Oppenheimer?
00:05:12.640 Yeah.
00:05:13.080 Yeah.
00:05:13.380 So the, so all the New Mexico shots in Oppenheimer, that's exactly where I grew up.
00:05:18.000 Oh, damn.
00:05:18.560 So it was like originally where all the scientists came together and, and did all the work in
00:05:23.060 the atomic bomb.
00:05:23.680 And there's still this huge lab.
00:05:25.880 That's basically the, everybody I knew effectively, like their parents worked at the lab or, or some
00:05:31.380 affiliate with the lab.
00:05:32.540 It's like Nukeville over there, huh?
00:05:33.980 Nukeville.
00:05:34.500 Yeah.
00:05:34.960 That's that.
00:05:35.400 Was it scary like that?
00:05:36.580 It was, you know, it, uh, at a level of mystery, like, is your prom like last night under the
00:05:42.100 stars or something?
00:05:42.960 Like there's a, there's this one museum.
00:05:45.300 Well, the funny thing is like, first you hear, you hear this, you learn the story of
00:05:50.300 the atomic bomb and the Manhattan project, like basically every year growing up, because
00:05:54.440 there's always like a day or a few days where, you know, there's a substitute teacher and
00:05:58.700 they just, they just play the videos.
00:06:01.040 So you're just like, yeah, an alcoholic or something.
00:06:03.560 Uh, a recreational user will use that term.
00:06:09.080 That's crazy that that just like every year you guys have like, yeah, just like blast it
00:06:14.220 Thursdays or whatever.
00:06:15.200 And it's just, yeah, you're just learning, you're learning again about the Manhattan project.
00:06:19.600 And then there's a little, there's a little museum in town, uh, which is like you walk
00:06:23.760 through and there's like a, a life-size replica of the nukes.
00:06:27.680 Um, so it's, it's pretty wild.
00:06:30.680 Yeah.
00:06:31.040 And where did they drop the atomic bombs on?
00:06:33.060 They dropped them on Asia, right?
00:06:34.300 Hiroshima.
00:06:35.220 They, yeah, they dropped them, uh, on, uh, in, in Japan.
00:06:39.560 Yeah.
00:06:39.800 Hiroshima and Nagasaki.
00:06:41.100 Is it, is it crazy being like semi-Asia, part Asian?
00:06:44.740 Uh, yeah, yeah.
00:06:45.640 My, uh, my parents are Chinese.
00:06:48.040 Oh, nice man.
00:06:49.080 Is it crazy being Asian and then having that happen to Asian people with the bomb?
00:06:52.960 Like, is that a weird thing there?
00:06:53.960 It's nothing.
00:06:55.000 Uh, no, I don't think so.
00:06:56.220 I mean, I think the thing is like, you know, I, uh, so there weren't, I didn't grow up
00:07:00.400 with very many Asians, uh, cause the, in that town, it was, um, you know, it's, it's in New
00:07:05.700 Mexico.
00:07:06.020 There's very few Asians in New Mexico.
00:07:08.260 So I was one of the only Asian kids in my class growing up.
00:07:12.080 And so, um, I didn't think that much about it, honestly, but then like, uh, but it is super
00:07:17.580 weird.
00:07:17.860 You know, you grow up and you learn about this very advanced technology that had this
00:07:24.320 like really, really big impact on the world.
00:07:26.820 And I think that shaped.
00:07:28.540 Yeah.
00:07:28.700 It's like the scientific John Jones over there.
00:07:30.680 Really?
00:07:30.920 He's a New Mexican, isn't he?
00:07:32.040 He's, he lives in Albuquerque.
00:07:33.420 Yeah.
00:07:33.540 Oh, he does?
00:07:33.960 I think he does.
00:07:34.380 Yeah.
00:07:34.520 Oh, sweet man.
00:07:35.400 Yeah.
00:07:35.740 Yeah.
00:07:35.920 Oh yeah.
00:07:36.280 That, so you, so you're there.
00:07:38.020 There's this, there's this energy always there of this creation.
00:07:42.020 So probably the possibility of creation maybe was always in the air.
00:07:45.080 I'm just wondering like, how did you get formed kind of, you know, like what's your
00:07:48.380 origin story kind of?
00:07:49.940 It was super scientific.
00:07:51.120 Cause, cause you know, there were all of these, there were all these like presentations
00:07:54.900 around what were the new kinds of science that were going on at the lab.
00:07:58.740 So there's all these chemistry experiments and these different like earth science experiments
00:08:03.060 and, uh, physics experiments.
00:08:05.440 Um, and my mom studied like plasma and like how plasma, you know, worked inside of stars and
00:08:13.200 stuff like that.
00:08:13.760 So it was just the wildest stuff and you would talk to people's parent people, you like, I
00:08:19.280 would talk to my classmates or I talked to their parents about what they're working on
00:08:21.960 is always some crazy science thing.
00:08:24.980 So that was, that was really cool.
00:08:28.180 Cause, cause everybody in that town basically is, they're all working on some kind of crazy
00:08:34.220 scientific thing.
00:08:35.400 And so you kind of, I mean, I feel like I grew up feeling like, you know, anything was
00:08:40.440 possible in that way.
00:08:41.620 Like, yeah, cause the rest of us in other communities, shitty communities or whatever,
00:08:45.140 we're just making that volcano or whatever.
00:08:47.040 You know what I'm saying?
00:08:47.440 We're doing like grassroots level bullshit, you know?
00:08:52.680 Um, dang, that's gotta be wild.
00:08:55.740 So every, you see somebody just sneaking behind an alley and buying a bit of uranium and shit
00:09:00.240 like that in your neighborhood, that's gotta be kind of unique.
00:09:02.840 I remember there was someone from our town who did, uh, who did a science fair project
00:09:07.180 called, um, it's called like great balls of plasma.
00:09:10.180 And for the science, literally for the science fair, this was in like high school for the
00:09:13.760 science fair, they made like huge balls of plasma in their garage.
00:09:19.660 And I was like, what the hell?
00:09:21.680 Like this is, we're all just doing this in high school.
00:09:26.080 Damn.
00:09:26.900 So did you feel competitive or did you just feel like hyper capable?
00:09:31.180 Like, did you feel like kind of advanced just in your studies?
00:09:33.480 Like when you were young, like, did you were in class?
00:09:35.620 Did you like, are some of this stuff's kind of coming easy to me?
00:09:38.000 I ended up, what, what I did is, um, there was, uh, I ended up getting really competitive
00:09:43.880 about math in particular.
00:09:46.020 Yeah.
00:09:46.360 And so, uh, so my sport was math, which is kind of crazy.
00:09:51.900 Algebra easy, son.
00:09:53.260 I know I feel you.
00:09:54.500 Um, and it was because when I was in middle school, when I was in sixth grade, there was
00:09:58.500 this one math competition where if you, if you got top four in the state, state of New
00:10:04.500 Mexico, then you would get an all expense paid trip to Disney world.
00:10:08.280 And, uh, and I remember as a sixth grader, like that was the most motivating thing I could
00:10:14.740 possibly imagine was like an all expense paid trip to Disney world.
00:10:18.160 Yeah.
00:10:18.880 Um, and then, uh, did you win it?
00:10:21.720 I got, I won it.
00:10:22.600 I got fourth place.
00:10:23.460 So I snuck in there, uh, snuck in, I, uh, and then I went to, and then we went to Florida
00:10:29.260 to Disney world and I, uh, I hadn't traveled, like I didn't travel around too much growing
00:10:34.360 up.
00:10:34.580 Like I mostly was in New Mexico and we did some road trips around the Southwest.
00:10:37.240 So I remember getting to Florida and it was extremely humid.
00:10:41.820 It was like, I never felt what humidity felt like when I landed in Florida.
00:10:45.980 I was like, Oh, this feels bad.
00:10:48.120 And then I, yeah, it definitely does.
00:10:53.100 Yeah.
00:10:53.620 It's funny.
00:10:54.260 Cause I grew up in humid, dude.
00:10:55.860 Like you would like, you try to shake somebody's hand, you couldn't even land it because it was
00:10:59.800 just not enough of viscose, just too much lube in a handshake.
00:11:03.400 You'd sit there for 10 minutes trying to land a handshake, you know, everybody was
00:11:07.220 always dripping, you know, all the time people, you always thought they were scared or something
00:11:10.700 where they were kind of geeked up off a gas station dope or something, but people were
00:11:14.160 just, it was humid.
00:11:15.500 Yeah.
00:11:15.820 You get really humid over there.
00:11:17.480 So then I became, so then I became a mathlete.
00:11:19.880 Uh, that was like a big part of my identity was being a mathlete.
00:11:23.100 And I would,
00:11:23.500 And you have to wear a vest.
00:11:24.340 Like, what do you guys do?
00:11:24.960 Is there an out, is there a uniform for that?
00:11:26.740 Well, you, you need a, you have a calculator.
00:11:30.660 Okay.
00:11:30.860 So everyone had their favorite calculator.
00:11:33.080 Okay.
00:11:33.400 You got that Draco on you, baby.
00:11:35.000 I feel you keep strapped.
00:11:36.680 You stay strapped at that time.
00:11:39.360 And then, uh, but, uh, but outside of that, not really.
00:11:42.260 I mean, like everyone had their favorite, like his pencil, you know, your pencil, your calculator,
00:11:45.620 that was your gear.
00:11:46.280 And, uh, but yeah, no, I, I was, I was like a, a nationally ranked mathlete for, uh, for
00:11:53.380 middle school, high school.
00:11:55.020 That was my, that was my jam.
00:11:57.080 Yeah.
00:11:57.440 And is it, what's this math circle like?
00:11:59.560 Is there, wow, that's crazy, dude.
00:12:03.240 So you go to, what are there competitions you go to with that?
00:12:05.760 There's competitions.
00:12:06.460 Yeah.
00:12:06.940 There's like, there's like state level competitions.
00:12:09.340 There's national level competitions.
00:12:11.140 There's like these summer camps, math camp.
00:12:14.120 I went to math camp a bunch of times.
00:12:16.400 Um, where you, where you convene with like-minded mathletes.
00:12:19.560 Oh, damn.
00:12:20.100 Okay.
00:12:21.180 Wow.
00:12:21.740 Fucking wizards in the park.
00:12:23.740 Uh, a couple of dudes fucking fighting over a common denominator in the lobby.
00:12:27.860 That's crazy, bro.
00:12:29.420 But then you're, but just like any other sport, like you're like, it's competitive, you know,
00:12:33.620 you gotta like, you gotta win.
00:12:35.380 And, uh, and so, so you're chummy with everyone, but you're also like, we're like, who's going
00:12:40.400 to win, you know?
00:12:41.280 Yeah.
00:12:41.940 No, dude, competition's amazing, man.
00:12:44.000 That's one thing is too, that's so nice.
00:12:45.380 I think about when you're young, it's like, if you're involved in a sport or a group or
00:12:48.740 whatever it was, um, just the, just that chance to be competitive, you know, what are, what
00:12:54.200 were like some of the competitions at the math thing?
00:12:55.840 Like what's a, what's a math competition?
00:12:57.460 Like when you get there?
00:12:58.740 So you, uh, the ones in the middle school, middle school were actually more fun.
00:13:02.920 Cause there's like, there's almost like a Jeopardy style buzzer competition and stuff
00:13:06.760 like that.
00:13:07.720 But, uh, but in high school, once you get to high school, it's just, uh, you just take
00:13:11.980 a test and you just, it's like the, the biggest one in America is called the USEMO.
00:13:17.260 So, uh, bring it up.
00:13:19.600 US MO?
00:13:20.600 US AMO.
00:13:22.260 Okay.
00:13:23.020 And, uh, and it's like, people all around the country take this test.
00:13:28.760 Uh, and there's, uh, United States of American mathematical Olympiad.
00:13:32.900 Okay.
00:13:33.160 Yeah, there you go.
00:13:34.220 Um, okay.
00:13:35.320 A participant must be either a US citizen or a legal resident of the US.
00:13:38.140 Um, okay.
00:13:39.520 Okay.
00:13:39.840 Go on.
00:13:40.460 And then you, uh, and then it's a nine hour test.
00:13:43.560 Damn.
00:13:44.100 It's split over two days.
00:13:45.260 It's nine hours you, it's four and a half hours and it's nine hours and you have six
00:13:50.480 problems.
00:13:51.640 So it's kind of nerve wracking and you get in there, you have four and a half hours the
00:13:55.480 first day, four and a half hours, the second day.
00:13:57.440 Um, can you cheat in between the days?
00:13:59.200 Can you just go home and.
00:14:00.380 No, you, cause you only get three questions the first day and then you only get three
00:14:03.580 questions the next day.
00:14:05.060 And, uh, I remember the first time I took it, I was in, I think I was, uh, I was in eighth
00:14:11.620 grade.
00:14:11.940 First time I took it and I like, I was like so nervous and I just, I like brought a thermos
00:14:17.520 full of coffee and I drank so much coffee that like those four and a half hours felt
00:14:22.480 like, like year.
00:14:23.720 It was like, I was so jittery the whole time.
00:14:26.180 It was, uh, it was crazy.
00:14:27.840 Dang.
00:14:28.180 You out there rattling brother.
00:14:30.140 Oh, integers make me rattle brother.
00:14:32.060 I feel you on that.
00:14:33.700 Dude.
00:14:33.920 So that's pretty, so, so you're competitive and so you get out of there and you're obviously,
00:14:38.580 I guess, um, admired in the world of math probably, or that's, that's like a thing
00:14:42.740 that like, uh, that you can point that that's like a pin that you, a feather in your cap.
00:14:48.340 So that, that helps you get into, um, MIT, right?
00:14:51.700 Yep.
00:14:52.040 Yep.
00:14:52.200 Okay.
00:14:52.620 So then, uh, yeah, in MIT on the MIT admissions, they ask for your like competitive math scores.
00:14:59.920 So if you're, so, so you knew a lot of kids going there probably cause it was tons of kids.
00:15:04.060 Yeah.
00:15:04.320 Yeah.
00:15:04.760 Tons of kids.
00:15:05.320 It was like a reunion.
00:15:06.020 It was like math camp reunion.
00:15:06.960 Damn.
00:15:07.520 Cause I was wondering where all y'all were at, dude, because we were doing some other
00:15:10.840 shit.
00:15:11.400 You know what I'm saying?
00:15:12.300 My, yeah, we were not, God, that's where all the smart kids were.
00:15:16.380 That's fascinating, man.
00:15:18.020 And so then, uh, yeah, I was at MIT and then, uh, MIT is like a really intense school because
00:15:25.240 they, you know, the classes they don't fuck around with.
00:15:28.260 They just really like, they load you up with tons of work.
00:15:31.000 And most of the time you have, you, you like, they load you up with like huge amounts
00:15:34.900 of work and they, you know, you, you, uh, you don't really know what's going on initially.
00:15:38.720 And so you're just kind of like, you're just trying to make it through.
00:15:41.520 But, uh, you know, there's this motto that, uh, that MIT has called, uh, IHTFP, which,
00:15:48.720 uh, among the students stands for, I hate this fucking place.
00:15:51.680 Oh yeah.
00:15:52.680 It's heavy, huh?
00:15:53.800 Yeah.
00:15:54.100 But then, but then the school, when you go, they tell you, oh, it stands for, um, I've truly
00:15:57.960 found paradise.
00:15:58.860 So, oh, so a couple of differences of opinion.
00:16:02.340 Yeah.
00:16:02.720 Yeah.
00:16:03.260 Damn.
00:16:03.620 So it's, so it's really severe there.
00:16:06.140 There's a lot of work.
00:16:07.040 They load you down out of the gate.
00:16:08.600 You do, you do a lot of work, uh, but it's, it's kind of awesome.
00:16:12.060 I mean, cause, cause I think the students really band together.
00:16:14.220 Like you, like, instead of, like, I think, uh, instead of it being competitive, really
00:16:19.200 MIT is much more about like everybody kind of like coming together, working on problems
00:16:24.640 together, working on homework together, and just kind of like making it through together.
00:16:28.760 Um, what's that social life like there?
00:16:30.940 Like, are you dating?
00:16:31.700 Are you going to parties?
00:16:32.400 What's that like for you at that time?
00:16:33.960 It's a bunch of parties because people like MIT, there's a lot of people who like tinkering
00:16:38.360 with gadgets.
00:16:39.380 So they're tinkering with like, you know, lights and big speakers and DJ sets and all
00:16:44.640 this stuff.
00:16:45.000 So actually the parties are pretty good.
00:16:47.220 Cause they're like, the production value is high.
00:16:49.820 The production quality is high.
00:16:50.900 Um, and then what about when the science kids come through the lab dogs, are they bringing
00:16:56.540 out people making like unique drugs?
00:16:58.160 Was there like designer drugs that are being created by actual people?
00:17:01.220 Like just because you know what I'm saying?
00:17:03.700 Like my friends in college, none of us would know how to do that, but there may be somebody
00:17:08.460 at a smart, at a more prestigious school that would know how, was that a thing even?
00:17:12.860 Or is that just, there's one, there was one part of the campus called East Campus, um,
00:17:17.900 where it was like, it was more fringe.
00:17:19.800 And so there was like, at one point in the school year, they would, in their courtyard,
00:17:24.460 they would build a gigantic catapult, like a huge catapult.
00:17:28.240 Like a trebuchet?
00:17:29.460 Like a trebuchet.
00:17:30.380 Yeah.
00:17:31.460 What's the difference?
00:17:32.160 I don't know what the difference.
00:17:32.940 Yeah.
00:17:33.120 Let's get the difference there.
00:17:33.940 Cause people need to know this anyway.
00:17:35.340 People have, for decades now, people have been.
00:17:37.480 A trebuchet has like a rope attached to the end of it that flings it where a catapult just
00:17:42.400 launches it.
00:17:43.380 No, it was a catapult then.
00:17:44.540 Okay.
00:17:45.080 Like a big, cause there was a big cup.
00:17:46.680 There was a big like, like ice cream scooper.
00:17:48.940 Oh yeah.
00:17:49.380 Looking thing that would like, that would fling it.
00:17:51.780 And what are they just flinging Adderall into the rest of the campus?
00:17:54.260 They would fling stuff into the river, which I don't think, now that I'm thinking about
00:17:58.100 it, I feel like, I don't know what the, yeah, no, these giant, this giant like catapult
00:18:02.300 things.
00:18:03.340 Um, yeah.
00:18:05.020 And so this was like a event that would go on and people would kind of rave there.
00:18:08.900 What are you saying?
00:18:09.740 Yeah.
00:18:09.920 They would do this.
00:18:10.860 They would do other things.
00:18:11.920 They were, they were like, they were into, they would like build a lot of stuff over there.
00:18:15.520 Um, and there would be.
00:18:17.120 Like people that ended up at Burning Man later on.
00:18:18.980 Yes.
00:18:19.280 Yes.
00:18:19.520 That was the burning, that was core Burning Man.
00:18:21.700 Like there was a, there was a satanic ritual floor.
00:18:24.320 Oh yeah.
00:18:24.820 Yeah.
00:18:25.060 Like a lot of, a lot of, like it's fringe.
00:18:27.240 Cool.
00:18:27.720 It's cool.
00:18:28.280 Right.
00:18:28.760 Um, but, uh, but there, yeah, so there's all these parties.
00:18:31.440 We, we bragged at MIT that, uh, you know, people from all the neighboring schools, cause
00:18:36.720 Boston is a huge college town, like tons and tons.
00:18:39.180 Boston's an amazing city.
00:18:40.360 Yeah.
00:18:40.740 But no, MIT was fun, but I was only there, I was at MIT for one year.
00:18:43.760 Right.
00:18:43.940 And you dropped out.
00:18:44.900 Is that safe to say?
00:18:45.520 Yep.
00:18:45.820 Okay.
00:18:46.000 You dropped out and then you, so you got into AI, into the AI world.
00:18:51.120 Is that kind of a safe bridge to say?
00:18:52.760 Yeah.
00:18:53.060 Yeah.
00:18:53.260 Yeah.
00:18:53.340 Yeah.
00:18:53.440 Okay.
00:18:53.880 Is there, and I want to ask this cause I know Mark Zuckerberg dropped out of college.
00:18:58.660 You dropped out of college.
00:19:00.000 Both of you guys have had success in tech and kind of, you know, forward thinking that sort
00:19:05.240 of, uh, world.
00:19:07.080 Um, is there something in college that you felt like didn't nurture you or did you just
00:19:12.540 feel like this isn't the place for me?
00:19:15.520 Do you feel like college doesn't nurture a certain type of thinker or was it just a personal
00:19:21.340 choice?
00:19:21.820 I think for me, it was like, um, I was just feeling really impatient.
00:19:28.840 Um, and I don't really know why really, but I remember like, I remember I was in school
00:19:34.340 the first year.
00:19:34.960 I, it was, it was really fun and, and I really enjoyed it.
00:19:37.700 But then I remember, you know, this in the year when I was at MIT was one of the first,
00:19:44.980 like, it was like one of the early big moments in AI because it was, um, I don't even remember
00:19:49.780 this, but there was an AI that beat, uh, the world champion at Go.
00:19:54.640 Um, this was in 2015, which is when I was in college.
00:19:58.760 At Go.
00:19:59.240 And that was a game?
00:20:00.260 Yeah.
00:20:00.500 Go.
00:20:00.780 It's like, uh, it's like a big checkerboard with like white and black stones.
00:20:06.080 It's like, uh, um, and it was, uh, yeah, this, this game AlphaGo versus Lisa Dole.
00:20:13.200 So AlphaGo versus Lisa Dole, also known as the DeepMind Challenge match was a five game
00:20:17.640 Go match between top Go player Lisa Dole and AlphaGo, a computer Go program developed by
00:20:22.440 DeepMind played in Seoul, South Korea between nine and 15, between nine, ninth and 15th of
00:20:29.660 March, 2016.
00:20:30.820 I was confusing how that's written.
00:20:32.020 It is very confusing.
00:20:32.900 Do you think that we got to, um, AlphaGo won all, but the fourth game, all games were won
00:20:39.680 by resignation.
00:20:40.500 The match has been compared with the historic chess match between Deep Blue and Gary, uh,
00:20:45.300 Kasparov.
00:20:46.680 Huh.
00:20:47.200 The winner of the match was slated to win $1 million since AlphaGo won.
00:20:49.700 Google DeepMind stated that the prize would be donated to charities, including UNICEF
00:20:53.620 and USAID.
00:20:55.460 That's just a joke.
00:20:56.600 That's just, um, but Lee received $150,000 for playing.
00:21:00.740 So this was a big moment because this had never kind of happened before.
00:21:05.060 Never happened.
00:21:05.560 Yeah.
00:21:06.020 And it was, it was a big moment for, uh, for AI.
00:21:09.960 It was like, oh wow, this stuff is like, it's really happening.
00:21:12.820 And so then this happened in March and I guess, yeah, I dropped out, started my company in
00:21:17.440 May.
00:21:17.820 So I guess two months after this, I was, I was, uh, yeah, that's what the game looks like.
00:21:23.220 Um, oh, I've, I've played this game before.
00:21:26.340 It's honestly, it's a really, uh, like, I'm not very good at the game.
00:21:31.340 It's a little more fun than playing chess unless you're like, you know, in a like Renaissance
00:21:34.640 fair board games or whatever.
00:21:35.820 Yeah.
00:21:36.120 Yeah.
00:21:36.820 Um, okay.
00:21:37.560 So now we got you, you're loose, dude.
00:21:40.140 You're out of the school and you're in the world.
00:21:43.580 You see, did that match, did realizing that kind of like spurn, you don't want to leave
00:21:48.160 school or did it, was that just something that happened around the same time?
00:21:51.040 It, it kind of did.
00:21:52.520 Basically it was, it was, I remember feeling like, oh wow, AI it's happening.
00:21:57.520 And, uh, this was back in 2016.
00:21:59.440 So like eight, eight, nine years ago.
00:22:01.420 Okay.
00:22:01.700 And then, um, I felt like I had to, you know, basically that, that inspired me to start my
00:22:06.880 company.
00:22:07.840 Um, and I moved, I basically went straight from, I remember this, I was, I flew straight from,
00:22:13.860 uh, Boston to San Francisco and then started the company basically.
00:22:19.300 Um, and that scale AI.
00:22:20.680 Scale AI.
00:22:21.440 Okay.
00:22:21.900 And so did you, where'd you been following AI?
00:22:25.220 Like what are your kind of, are you just like knew like, this is where it's going?
00:22:28.900 Like you just felt there was something, an instinct that you trusted or like, because
00:22:33.580 that's a big thing to do.
00:22:34.960 I was stuck.
00:22:35.900 So I took all the AI classes at MIT.
00:22:38.640 Okay.
00:22:38.860 So you already learning a lot about it.
00:22:40.500 Yup.
00:22:40.760 And then there was one class where, um, you had to like on all the classes you had to do
00:22:45.980 side projects or final projects of some kind.
00:22:48.080 And in one of them, I wanted to build a, um, like a camera inside my refrigerator that
00:22:54.500 would tell me when my roommates were stealing my food.
00:22:57.000 Wang boy.
00:22:57.840 Catching them.
00:23:00.040 Wang boy.
00:23:01.520 Wow.
00:23:01.920 And, uh, but then, so I worked on that and then it was, there was one moment where,
00:23:07.500 where, uh, I was like, there was like, there was a moment that clicked where I was trying
00:23:12.060 to build this thing.
00:23:12.840 And then there was one step that was like too easy.
00:23:15.180 I was like, Whoa, that just worked right there.
00:23:17.280 And then that happened.
00:23:18.940 And then the, the go match happened.
00:23:20.740 And I was like, this, this stuff is happening.
00:23:23.220 And so I, did you ever market those refrigerators?
00:23:25.180 You ever actually create that?
00:23:26.440 I didn't market them.
00:23:27.480 No, I could totally see that, bro.
00:23:29.480 There's a refrigerator.
00:23:30.480 Every dorm has it where there's a camera built in and you just get, you're in, you get a notification
00:23:34.880 on your phone.
00:23:35.880 You know, you're like, damn, Adnan's got my hummus, you know, but you got video of them
00:23:41.620 right there, dude.
00:23:42.380 That's a great idea.
00:23:43.580 Yeah.
00:23:43.800 I love that.
00:23:44.280 That was, uh, that was, that was college me.
00:23:46.500 Yeah.
00:23:47.720 Okay.
00:23:48.120 So Alexander Wang, he's free in the world now.
00:23:50.980 He's headed to San Francisco.
00:23:52.920 He's AI'd up.
00:23:54.000 He feels the energy.
00:23:54.920 He's motivated by some of the classes he took.
00:23:57.920 Um, he's motivated by seeing that AI starting to actually overtake humans, right.
00:24:02.760 Or be able to compete with actual human thinking with the chess match.
00:24:07.280 Yeah.
00:24:07.760 I would, the way I would think about it or the way I thought about the time was like, this
00:24:10.960 is, this is becoming, um, I, you know, people, you know,
00:24:14.260 people have been talking about AI for decades.
00:24:16.520 Like it's kind of been always been one of these things that, um, people have, have said,
00:24:20.860 oh, it's going to happen, but it never really was happening.
00:24:22.680 And it felt like it was, you know, it was really about to happen.
00:24:24.880 About artificial intelligence.
00:24:26.020 Yeah.
00:24:26.160 Yeah.
00:24:26.480 Yeah.
00:24:26.660 Yeah.
00:24:27.220 Cause I've always heard about artificial intelligence.
00:24:28.780 That's what I have.
00:24:30.020 So I've always been like, um, no, you're, you are, you have real intelligence, not artificial,
00:24:35.440 real intelligence.
00:24:35.960 I don't, I mean, I think it's a, it's probably a mix, but I see what you're saying.
00:24:40.100 You know, I do.
00:24:41.420 Do you know what I thought of the other day?
00:24:42.360 It was like, what if they had like a Mexican version?
00:24:44.180 And it was like, Hey, I, I don't know.
00:24:48.620 That's a good, that's a good joke.
00:24:50.380 But thank you.
00:24:51.140 It's a nice laugh.
00:24:51.900 Um, this show is sponsored by better help.
00:24:57.180 What are some of your relationship green flags?
00:25:00.540 That's a good question.
00:25:02.060 What are things you notice in your relationship that are like, yes, this is positive.
00:25:07.960 This is good.
00:25:09.020 We're headed in the right direction.
00:25:10.620 We often hear about the red flags we should avoid, but what if we focused more on looking
00:25:16.880 for green flags in friends and partners?
00:25:19.920 If you're not sure what they look like, therapy can help you identify green flags, actively
00:25:25.200 practice them in your relationships and embody the green flag energy yourself.
00:25:30.540 I've personally benefited from therapy over the years.
00:25:33.820 I think one of the things it's helped me with is just noticing some of my patterns, noticing
00:25:38.620 when I let something small, um, turn into something big and being able to cut it off at the past.
00:25:44.840 So it doesn't happen anymore.
00:25:47.680 Better help can help you.
00:25:50.300 Better help is fully online, making therapy affordable and convenient, serving over 5 million
00:25:56.640 people worldwide.
00:25:58.260 Discover your relationships.
00:25:59.720 Green flags with better help.
00:26:02.140 Visit better help.com slash Theo to get 10% off your first month.
00:26:07.300 That's better help.
00:26:09.020 H E L P.com slash T H E O.
00:26:13.200 Are you struggling to keep your website and digital marketing up to date while running your
00:26:19.160 business?
00:26:19.920 Now you can relax and binge on unlimited web design services with modify.
00:26:26.600 Enjoy unlimited support, a quick turnaround, and your own designated designer.
00:26:34.620 Modify has done my website for years and they're incredible.
00:26:39.340 Sign up with modify today.
00:26:41.480 They make life easier and business better.
00:26:44.560 Visit modify.com slash Theo.
00:26:48.360 That's M O D I P H Y.com slash T H E O for 50% off.
00:26:55.920 The last website you'll ever need.
00:26:59.720 That's modify.com slash Theo.
00:27:04.960 Um, what is AI?
00:27:07.260 Yeah.
00:27:07.880 So, so AI is all about, you know, basically programming computers to be able to start thinking like
00:27:15.320 humans.
00:27:15.700 So, you know, traditional computer programming, uh, you know, it's pretty, it's pretty bare
00:27:22.040 bones.
00:27:22.420 It's not very smart.
00:27:23.260 And so AI is all about, can you have, can you build algorithms that start to be able to
00:27:28.560 think like people and, and replicate some of the, like our brains are these incredible,
00:27:35.680 incredible things, you know, and that's, that's evolution.
00:27:38.740 That's just biology that created our brains.
00:27:41.080 Um, and so it's all about how can we, how can we build something similar to that or replicate
00:27:45.760 it, um, using, using computers and machines.
00:27:48.720 And so the whole, you know, the, the, the whole modern AI, um, era really started around,
00:27:57.080 um, an area called computer vision, but it was like, how can we first get, um, computers
00:28:02.940 to see like, uh, like humans do.
00:28:06.500 So one of the very first, so, you know, one of the very first AI projects was this thing
00:28:12.600 called image net, um, image net and later Alex net.
00:28:16.420 And it was basically, can you get computer programs to like given a photo to tell you
00:28:22.300 what's in that photo?
00:28:23.100 So I see, so just like a human would, like if you showed them this, you're starting to
00:28:26.840 train them to, um, have a perspective.
00:28:30.940 Yeah.
00:28:31.400 Train them to, well, actually originally, like, uh, you could like, let's say you took
00:28:35.440 a photo of this, of this bottle, a machine wouldn't even be able to tell you what's in
00:28:39.020 the photo.
00:28:39.580 It would just know what the pixels were, but it wouldn't be able to tell you like, oh,
00:28:43.540 there's a bottle in that, in that photo.
00:28:45.640 So the, you know, one of the first AI breakthroughs was when YouTube, um, built an algorithm that could
00:28:51.540 tell when there were cats in their videos.
00:28:54.720 And that was like this, you know, in like 2012 or 2011, this was like this mind blowing
00:29:00.080 breakthrough that you could like figure out, you could like use an algorithm to figure out
00:29:04.400 when there was a cat inside, uh, inside a video.
00:29:07.220 And so AI, it started pretty simply with just how do we replicate, you know, um, vision?
00:29:15.000 Like how do you replicate like our, basically the fact that our eyes and our brains can like
00:29:18.720 process all this imagery coming in.
00:29:21.540 And that really led to, I think one of the first major use cases of AI, which is self-driving
00:29:27.420 cars.
00:29:28.420 So this was it, when we, when I started the company in 2016, self-driving cars were all
00:29:34.160 the rage because, um, you know, it was, you know, there were all these like skunkworks
00:29:38.760 projects.
00:29:39.440 When you started scale AI, you mean?
00:29:40.780 Yeah.
00:29:40.980 When I started scale AI.
00:29:41.940 So when you, when you kind of got into AI at that time, self-driving cars are the, are
00:29:45.940 the most popular things.
00:29:47.120 Yeah.
00:29:47.540 Yep.
00:29:47.900 That was all the rage.
00:29:48.920 And so it was, it was all about, can, you know, um, can we start building algorithms
00:29:53.640 that can drive a car like a human would and do it safely and do it, you know, more efficiently.
00:29:58.560 And that way, instead of, that was, that was one of the first major areas.
00:30:03.500 Okay.
00:30:03.820 And then, um, now, you know, I, the whole, the whole industry has moved so fast, but then
00:30:09.700 all of a sudden we got chat GPT and we got, you know, more advanced stuff more recently
00:30:14.300 that, that is able to talk like a human or sort of think like a human.
00:30:17.740 And so it's really come pretty far recently, but all of it is about how do you, you build
00:30:23.700 algorithms, how to use machines to be able to think like a person.
00:30:26.840 Okay.
00:30:27.160 And is it a pro if it's like, say if I opened a door, like, like, oh, we keep the AI in
00:30:32.460 there.
00:30:32.920 Is it a computer?
00:30:34.340 Is it a program?
00:30:35.900 Is it a hard drive?
00:30:37.500 Like what, like, what is that?
00:30:40.920 Yeah.
00:30:41.220 Yeah.
00:30:41.380 So there's two parts, there's two parts to it.
00:30:43.640 So the first part is you need really advanced chips.
00:30:48.840 So you need like, uh, uh, like these, they're called GPUs or sometimes called TPUs or, you
00:30:55.000 know, there's a lot of different words for it, but you need like the most advanced computer
00:30:58.480 chips in the world.
00:30:59.420 Okay.
00:30:59.860 And they, how big is each one?
00:31:01.320 Do you know?
00:31:02.100 Uh, like, or can you measure it like that?
00:31:04.540 They, I mean, the biggest ones are actually like the whole chips are like, you know, pretty
00:31:09.060 big, but they, they're like a, they're like a, like a wafer kind of thing.
00:31:13.020 But then you, um, you put a lot of them all together.
00:31:16.260 Okay.
00:31:16.500 So these, uh, yeah, yeah, yeah, exactly.
00:31:20.460 Um, these like, uh, these chips and the, the biggest ones are, are, yeah, exactly.
00:31:25.580 That's a little one.
00:31:26.400 That's a little one.
00:31:27.380 There's really big ones.
00:31:28.840 Okay.
00:31:29.400 Um, but these are the, so this is the, these are the brain cells of it.
00:31:34.180 Yep.
00:31:34.780 These are, these are the brains behind it.
00:31:36.300 Yeah.
00:31:36.760 So then, um, yeah, exactly.
00:31:38.600 These are some, those are some big ones.
00:31:40.000 So then.
00:31:40.700 So you have to have a lot of these chips.
00:31:41.960 So you need a ton of these chips and that's, that's kind of the, the, like, uh,
00:31:46.780 that's the physical presence.
00:31:48.500 And then, um, and then by the way, they take a huge amount of energy.
00:31:52.280 They're really, uh, they, cause they have, you have to do a lot of, you know, calculations
00:31:56.480 and there's a lot of math that has to happen on them.
00:31:58.560 Um, and so, um, and you have, you have giant, basically data centers, buildings full of
00:32:04.780 like tons and tons of those chips just in giant rows.
00:32:07.480 Like how big are we talking warehouses?
00:32:09.720 Yeah.
00:32:10.060 The biggest one, I mean, like, um, like Elon's data center Colossus is like, uh, I mean, it's
00:32:16.900 probably more than a million.
00:32:18.080 It's definitely more than a million square feet.
00:32:19.440 I mean, it's like just huge.
00:32:20.700 Really?
00:32:21.080 Yeah.
00:32:21.300 Yeah.
00:32:21.440 Look up Colossus.
00:32:23.060 Uh, I've never known this.
00:32:25.000 Yeah.
00:32:25.600 Uh, yeah, yeah.
00:32:27.240 That, that, uh, you see that building with the sunset second row.
00:32:30.720 Yeah.
00:32:30.840 Yeah.
00:32:30.960 There you go.
00:32:32.060 Or, uh, the one to the left.
00:32:33.600 Oh no.
00:32:34.280 Yeah.
00:32:34.660 There you go.
00:32:35.040 That one.
00:32:35.620 Yeah.
00:32:36.080 Look, it's like a huge ass building.
00:32:38.800 It's huge.
00:32:39.380 And all that's just filled with chips.
00:32:41.600 Um, have you ever been in there?
00:32:44.000 I haven't been in that one, but I've, I've been in some of these and it's just, and this
00:32:47.300 is what it looks like inside.
00:32:48.240 Yeah.
00:32:48.400 Basically.
00:32:48.980 Yeah.
00:32:49.560 So it's just rows and rows of chips.
00:32:51.580 So no plants or anything.
00:32:53.300 Yeah.
00:32:53.560 No, no plants, no plants.
00:32:55.140 It gets hot in there.
00:32:56.200 I bet.
00:32:56.740 Um, so the first part is just the, the, that's the physical presence.
00:33:02.020 And then the second part are the algorithms.
00:33:04.900 So then you have like on top of those chips, you have, you have software that that's running
00:33:10.840 and the, the algorithms are just all are like what you actually are telling.
00:33:15.140 What's the math that you're telling to happen on the chips.
00:33:17.540 And those algorithms are, you know, some of the, you know, most complicated algorithms
00:33:22.480 that humans have ever come up with.
00:33:24.660 And that's kind of the, that's kind of the software part, or that's kind of the, that's
00:33:29.140 the part that like, you know, exists on the internet or you can download or whatnot.
00:33:33.000 And then it has to be run on these like huge warehouses of giant, of giant chips.
00:33:37.760 Okay.
00:33:37.960 Um, so when someone goes to like scale AI or chat GPT, these are all AI interfaces or
00:33:45.840 what are they?
00:33:46.520 They're.
00:33:47.200 Yeah.
00:33:47.440 So, so, um, yeah, like chat GPT is a, is a way to be able to, uh, talk to basically you
00:33:54.480 can talk to the algorithm.
00:33:55.580 Okay.
00:33:56.020 So you can start interacting directly with the algorithm.
00:33:58.660 You can see how the algorithm is thinking.
00:34:00.520 Um, so you could say to this, um, can you describe the weather today?
00:34:05.420 Exactly.
00:34:06.000 Yeah.
00:34:06.100 And if you said that to five different AI company or AI companies, basically.
00:34:12.800 Or AI algorithms, different, different AI systems.
00:34:15.380 Yeah.
00:34:15.460 So if you said that's five different AI systems, you might get a little bit of a varied answer.
00:34:18.980 A little bit.
00:34:19.540 Yeah.
00:34:19.880 Okay.
00:34:20.140 You'll get, you'll get, cause they all are trying to have their own style and have their
00:34:23.460 own vibe to it.
00:34:24.820 Interesting.
00:34:25.180 And then what we do, what, what scale AI is all about is we've kind of built the, uh,
00:34:30.860 almost like the Uber of AI.
00:34:32.380 Okay.
00:34:32.740 So a lot of what we're trying to do is how do we, so how do we help produce data that
00:34:38.320 is improving these algorithms?
00:34:40.120 And just like how Uber there's.
00:34:42.220 Okay.
00:34:42.580 You're losing me there a little.
00:34:43.720 Yeah.
00:34:43.980 Yeah.
00:34:44.100 It's okay.
00:34:44.920 But, uh, if I slow down the, if I, if I, if you're losing me, I, but so explain that
00:34:49.000 to me a little bit clearer for me.
00:34:50.720 Yeah.
00:34:50.900 Yeah.
00:34:51.140 So, so, okay.
00:34:52.700 So with these algorithms, um, one key ingredient for these algorithms is data.
00:34:58.360 So, okay.
00:34:59.540 So you have the chips and everything that are, that are storing all the information.
00:35:02.620 Yep.
00:35:03.040 They're storing the data.
00:35:04.020 Yep.
00:35:04.320 And then you have the algorithms that are helping mediate between the user and the data.
00:35:09.120 Yeah.
00:35:09.540 So basically you kind of have, yeah, you have three, you have three key pieces.
00:35:13.660 Okay.
00:35:13.900 So you have the, uh, you have the computational powers of the chips, you have the chips, you
00:35:19.240 have the data, which is like just tons and tons of data that that's where the algorithms
00:35:25.180 are learning the patterns from.
00:35:26.900 Okay.
00:35:27.060 So these algorithms, they aren't just like, they don't just learn to talk randomly.
00:35:30.740 They learn it from learning to talk from how humans talk.
00:35:33.880 Right.
00:35:34.220 Got it.
00:35:34.460 So you need tons and tons of data.
00:35:35.740 And then you have the algorithms, which, uh, learn from all that data.
00:35:40.600 And then they run on top of the chips.
00:35:43.180 Got it.
00:35:43.460 Um, so then one of the big challenges in the industry is, okay, how are you going to produce
00:35:49.800 all this data?
00:35:50.860 And so, um, uh, this is, how are you going to get data for your system?
00:35:56.720 Like how do you farm the best data?
00:35:58.560 How do you, exactly.
00:35:59.280 How do you build, how do you build all that data?
00:36:01.140 And how do you do that, um, in the most effective way?
00:36:04.300 How do you build new data?
00:36:05.820 So clean data, because what if you get a bunch of data in there, that's just a bunch of advertisements
00:36:09.300 and bullshit, will that affect the output?
00:36:11.720 A hundred percent.
00:36:12.340 Yeah.
00:36:12.700 That definitely affects the output.
00:36:14.420 So this whole data, so data is, is, you know, some people say like data is the new oil
00:36:19.540 or data is new gold.
00:36:20.640 Like data is really, really valuable because it's, it's how the algorithms are learning
00:36:26.380 everything that they're learning.
00:36:27.480 Like any, anything that the algorithms know or learn or say, or do all that has to come
00:36:33.700 from the data that goes into it.
00:36:35.540 Okay.
00:36:35.580 So, so if I ask the, uh, if I ask a system, an AI system, a question or ask it to help
00:36:40.500 me with something, help me to design something or to curate an idea, it's going to use the
00:36:46.400 data that it has within it to respond to me and help me, uh, and help give me an answer
00:36:52.480 that I can use.
00:36:53.540 And it's only, and the data it has in it is only based upon the data that, um, is put
00:37:00.020 into it.
00:37:00.620 Exactly.
00:37:01.360 Yeah.
00:37:01.520 So then, so yeah, it's kind of, um, so then we don't, you know, we don't spend enough
00:37:07.660 time talking about where it is, you know, how are you going to get this data and how
00:37:11.360 are you going to keep making new data?
00:37:13.140 So the, the angle that we took at scale was to kind of, um, turn this into an opportunity
00:37:20.140 for people.
00:37:20.720 So, um, we're, you know, we're kind of like the Uber for AI.
00:37:24.160 So just like how Uber, you have, you know, riders and drivers for us, we have, uh, you
00:37:30.780 know, we have the AI systems, you know, the algorithms that need data.
00:37:35.860 And then we have, um, a community of people, a network of people who help produce the data
00:37:41.560 that go into the system.
00:37:42.740 And they get paid to do that.
00:37:43.980 Ah, so they're almost data farming, like creating good data or creating good data.
00:37:48.340 Yeah, exactly.
00:37:49.280 So, and it's, and it's huge.
00:37:51.360 It's like, uh, so we do this through, um, our platform called outlier.
00:37:56.000 Okay.
00:37:56.520 And, uh, uh, outlier last year, uh, people, contributors, we call them contributors on
00:38:02.480 outlier earned about $500 million total across everybody.
00:38:08.000 Um, in the U S that's across 9,000 different towns.
00:38:12.480 Um, and, uh, so it created a lot of jobs, a lot of jobs.
00:38:15.580 Okay.
00:38:15.800 And so what would, um, okay.
00:38:17.960 So scale was your company, scale AI system.
00:38:21.140 Yep.
00:38:21.520 Is that right?
00:38:22.000 So we, uh, yeah, I mean, yeah, scale AI is an AI system.
00:38:26.520 Okay.
00:38:26.900 And then outlier.
00:38:28.520 Yep.
00:38:28.900 Is a separate company that's works with it.
00:38:31.540 Yep.
00:38:32.180 And that is where you are, um, hiring people.
00:38:37.840 We, yeah, we, we basically.
00:38:40.120 To pull in data.
00:38:41.460 Yeah.
00:38:41.880 We, we, we build this platform that anybody, you know, a lot of people all, frankly, all
00:38:47.440 around the world, but Americans too, can log on and, uh, and help build data that goes
00:38:54.420 into the algorithms and get paid to do so.
00:38:57.380 Wow.
00:38:58.060 Yeah.
00:38:58.360 So how, and how does a user do that?
00:39:00.200 Like, what is an example of somebody who's helping build data for an AI database?
00:39:04.640 Yeah.
00:39:05.140 Let's say you're a nurse, like you're a nurse with like tons of tons of experience.
00:39:08.660 So, you know, a lot about how to take care of people, um, and, uh, take care of, of people
00:39:14.160 who are sick or, you know, have issues and whatnot.
00:39:16.100 And, uh, so you, you could log onto the system and, and this, and our platform, and you could
00:39:22.480 see that the algorithm is, you know, let's say you ask the algorithm, like, Hey, I have
00:39:27.060 a, you know, I have a pain in my, in my stomach.
00:39:30.140 What should I do?
00:39:30.960 And you notice that the algorithm says the wrong thing.
00:39:33.440 Like the algorithm says, Oh, just, you know, hang out and, and, you know, it'll go, it'll
00:39:37.160 go away.
00:39:37.780 And you know, as a nurse, like that's wrong.
00:39:40.660 Like I, you know, you have to, you have to go to the emergency room because you might
00:39:44.460 have appendicitis or you might have, you know, you might have something really bad.
00:39:47.620 And so you would, as a nurse, you would go in and you would basically correct all these
00:39:51.960 mistakes that the AI system has.
00:39:54.120 And then that would then feed it back into the algorithm.
00:39:56.860 So that's smarter the next time.
00:39:58.200 So it's kind of this, this continual process of, and there's versions of that for whatever
00:40:03.820 your expertise is or whatever, you know, you know more about than anything, everything,
00:40:08.460 everything.
00:40:09.040 Exactly.
00:40:09.760 So, and so people get paid for that.
00:40:12.160 Yeah.
00:40:12.380 Yeah.
00:40:12.500 They get paid.
00:40:13.320 And how do you know if their information is valuable or not?
00:40:16.580 Uh, well we, so we don't want spam obviously.
00:40:19.860 So we, we, we have a lot of systems to make sure that people aren't spamming and that like
00:40:24.280 you're saying it's not, it's not, you know, it's not garbage in that's going into the,
00:40:28.320 into the algorithms.
00:40:29.500 Um, so we have, you know, we have kind of like people check the work of other people to
00:40:33.600 make sure that the AI systems are really good.
00:40:35.880 And we have some like automated systems that check this stuff.
00:40:38.800 But, uh, but for the most part, it's, it's like, it's really broad.
00:40:42.160 Like we want experts in anything, everything, shellfish, train tracks, whatever.
00:40:47.820 Yeah.
00:40:48.360 Gelatin, everything, childhood, death or whatever.
00:40:52.400 Yeah, totally.
00:40:53.280 Stars, galaxies, whatever.
00:40:55.220 Animals, big animals.
00:40:57.340 Yeah.
00:40:57.760 Yeah.
00:40:57.940 Damn.
00:40:58.780 So, wow.
00:41:00.240 So it's kind of like your data is almost like an ocean or a body of water and you, different
00:41:05.300 places are going to be able to keep their body of water cleaner or dirtier and different
00:41:09.660 infections could get in different spyware, all types of stuff.
00:41:12.480 So, and if you have the really a clean body of water, then you're going to be able to
00:41:16.520 offer a clean data or a certain type of data to, um, people who are using your AI platform.
00:41:24.820 Does that make sense or not?
00:41:26.360 Yeah, yeah, totally.
00:41:27.220 And our job is like, how do we make sure that this body of water is as clean as possible
00:41:31.680 and we fill it up as much as possible, that it has as much information about everything
00:41:36.720 across the globe.
00:41:38.100 Wow.
00:41:38.180 So is there almost a race for information right now in a weird way or no?
00:41:42.020 Is that not it?
00:41:43.060 There a little bit.
00:41:44.160 Yeah.
00:41:44.340 I think that there's a, well, there's a race for like, how are different AI systems competing
00:41:48.640 against each other?
00:41:49.380 And sorry to interrupt you.
00:41:50.520 Yeah, no, no.
00:41:51.240 So the, there's a, there's, it's, it goes back to the three things I mentioned.
00:41:54.980 So there's kind of like three dimensions that, that they're all competing as one another.
00:41:59.440 One is, um, chips.
00:42:01.620 So who could, who has the most advanced chips?
00:42:04.040 Who is the, the biggest buildings of chips?
00:42:06.480 Like who has the most chips, um, that they're utilizing, uh, data.
00:42:10.360 So the kind of body of water, whose body of water is better, cleaner, you know, uh, healthiest,
00:42:15.480 biggest, et cetera.
00:42:16.480 And then the last is algorithms.
00:42:18.280 So who's, and this is where the scientists really come in.
00:42:21.540 And it's like, okay, who's coming up with the cleverest algorithms or who has like a trick
00:42:26.960 on an algorithm that somebody else doesn't have.
00:42:28.940 Like who's doing that to basically, um, make the AI learn better off of the data that it
00:42:34.480 has.
00:42:35.540 Wow.
00:42:36.700 God, I'm in the future right now.
00:42:38.500 That's so, man, it's just, it's so crazy.
00:42:43.160 And I think AI scares people because the future scares people, right?
00:42:46.340 It's like, that's one of the scariest things sometimes is the future.
00:42:49.320 So I think a lot of times people, you associate, um, cause a lot of times when people mention
00:42:53.860 AI, there's a little bit of fear.
00:42:55.160 It seems like from people, um, there's fear that it's going to take jobs.
00:42:58.580 There's fear that it's going to take over, um, the, our ability to think for ourselves.
00:43:03.400 Yeah.
00:43:03.720 There's just kind of some general uncertainty there.
00:43:06.060 Like is this, and it kind of feels like fear a lot of times, but a lot of times fear is
00:43:10.280 just a lack of knowledge.
00:43:12.120 Right.
00:43:13.100 Um, and not a lack of knowledge because you didn't want to know just cause you don't know
00:43:17.220 or that you're dumb, but just cause you don't know.
00:43:20.220 Um, what are, what are positive things that we're going to see with AI?
00:43:24.360 Right.
00:43:24.620 I want to start there.
00:43:25.640 Yeah.
00:43:26.080 So I think first, like we don't, the AI industry, we don't do the best job explaining this.
00:43:31.920 And I think sometimes we make it seem all sci-fi and, and, and genuinely we were part
00:43:37.800 of the problem and making it seem scary.
00:43:39.620 Right.
00:43:40.100 Uh, but you know, one thing for example, is like, I think AI is actually going to create
00:43:45.760 a ton of jobs.
00:43:47.340 Um, and that story is not told enough, but you know, these jobs that we're producing or
00:43:52.680 this, this sort of, um, this opportunity that we're providing on our platform outlier,
00:43:56.720 like that's only going to grow as AI grows.
00:43:59.660 So.
00:43:59.880 Because you have to have new data.
00:44:01.080 You have to have new data.
00:44:02.400 And the only place you can get new data is from people will, will at a certain point
00:44:06.160 with the system, be able to create, it can probably matriculate data or matriculate.
00:44:11.400 Is that with birthing or what is that?
00:44:13.220 Yeah.
00:44:13.400 It just moves through like.
00:44:15.080 Okay.
00:44:15.540 It can probably like quantify or, and, and, and give you answers, but it can AI create new
00:44:21.820 data.
00:44:22.860 Uh, no.
00:44:24.040 So, so I think, well, it can do a little bit of that.
00:44:26.700 So it can, AI can help itself create its own data, um, and, and help itself a little
00:44:32.260 bit.
00:44:32.500 But ultimately most of the progress is going to come from, you know, people who are able
00:44:38.540 to help really the model get better and smarter and more capable at all these, all these different
00:44:43.000 areas.
00:44:43.280 Ah, yeah.
00:44:44.660 I didn't see that part of it.
00:44:45.980 I didn't understand that we are the ones who are giving it, um, information.
00:44:51.620 And since we're going to continue to learn, I would assume that we would be able to help
00:44:55.320 continue to help it learn.
00:44:57.640 Yeah.
00:44:57.780 And the world's going to keep changing and we're going to need to be able to keep teaching
00:45:01.480 the, the algorithms, keep teaching the models about how the world's changing.
00:45:05.000 So, you know, uh, this is actually a, a big thing that I think most people don't understand.
00:45:11.360 The people who are getting the opportunity now who are earning money from it see it, but
00:45:15.720 as AI grows, there's actually going to be tons of jobs created along the way and tons of
00:45:22.420 opportunity for people to help improve AI systems or control AI systems or overall, um,
00:45:29.520 sort of be a part of the, the technology, not just sort of disenfranchised by it.
00:45:34.200 Okay.
00:45:34.960 So like, yeah.
00:45:35.840 So what were, what are you, do you feel like are other ways?
00:45:38.140 Like if you had to look into the future a little bit, right?
00:45:40.300 So you have the fact that people are going to be able to add more data, right?
00:45:43.800 Yep.
00:45:44.000 And add nuances to data.
00:45:45.680 Yep.
00:45:46.120 Right.
00:45:46.380 And, um, and probably humanize data a little bit.
00:45:49.360 Yeah, totally.
00:45:50.340 Um, and then you're also going to have what?
00:45:54.540 I think you're going to have a lot of, um, a lot of jobs around, you know, as AI starts,
00:46:01.620 um, doing all these little things throughout the world, who's going to keep watch of those
00:46:07.540 AI and who's going to make sure that those AI are, uh, aren't doing something that we
00:46:11.560 don't want them to do.
00:46:12.540 So almost like managing the AIs and, and keeping watch over all the AI systems, that's going
00:46:18.160 to be another thing that we're going to have to do.
00:46:19.760 Um, and then, and then it's just somebody to kind of guide the river a little bit.
00:46:24.840 Yeah.
00:46:25.060 At certain point, guide the stream, stay in there, uh, watch, make sure that answers are
00:46:30.080 correct.
00:46:30.640 Make sure the information, um, is honest.
00:46:34.620 Yeah.
00:46:35.200 Yeah.
00:46:35.580 Like, like I think for example, um, you know, we're not going to just have AIs going around
00:46:41.780 and, you know, um, you know, buying stuff and doing crazy things.
00:46:47.520 And like, you know, we're going to, we're going to keep it controlled, right?
00:46:50.760 Like as a society, I think we're going to keep it controlled as a technology.
00:46:54.200 And I think there's going to be a lot of jobs for people to make sure that the AI doesn't
00:46:58.960 go out and do crazy things that we don't want it to do.
00:47:01.720 Right.
00:47:02.120 So we want to be, so you're going to need managers.
00:47:03.560 You're going to need facilitators.
00:47:04.600 Yeah, exactly.
00:47:05.540 What are things that AI will alleviate?
00:47:07.660 Like, what are things that like, will it eventually be able to have enough information
00:47:12.760 like our data where, uh, where it can like cure diseases and stuff like that?
00:47:18.240 Like, is that a realistic thing?
00:47:19.780 Yeah.
00:47:20.060 That's super real.
00:47:21.400 Um, like cancer even.
00:47:23.380 Yeah.
00:47:23.780 Cancer.
00:47:24.700 Um, yeah.
00:47:25.580 Heart disease, like all these, all these diseases.
00:47:28.880 Cancer on its heels, but, but sorry, Antonio Brown, Antonio Brown was just here.
00:47:33.440 I think there's still some smoke in the air.
00:47:35.260 Um, no, but seriously, I think that AI, uh, one thing that we've seen, which is, this is
00:47:42.000 kind of wild, but AI understands like molecules and biology better than humans do actually,
00:47:49.940 because, um, it's, uh, like, like, uh, there, there's this, there's this thing in AI where,
00:47:56.860 you know, it used to take a, like a PhD biologist, like, you know, five years to do something that
00:48:04.780 the AI can, can just do in, you know, a few minutes.
00:48:08.140 Right.
00:48:08.580 And that's because the, like, uh, just the way that molecules and biology and all of
00:48:14.520 that works is something that, that AI happens to be really good at.
00:48:17.680 And that's going to help us ultimately cure diseases, find, um, you know, pharmaceuticals
00:48:24.060 or other treatments for these diseases and ultimately help humans live longer.
00:48:27.940 Cause that's very data driven, right?
00:48:29.880 Like it's very specific.
00:48:31.240 It's very mathematic.
00:48:32.380 Exactly.
00:48:33.120 Yeah.
00:48:33.600 Yeah.
00:48:34.440 Um, so it's going to be a huge tool for us to, to cure disease, um, for us to help educate
00:48:40.200 people, um, for us to, you know, there's a lot of, a lot of really exciting uses for AI,
00:48:45.360 but I think the kind of, I think the thing that, um, will touch most people in their lives
00:48:51.220 is it's really going to be a, um, like a tool that'll help you, um, you know, make all of
00:48:58.560 your sort of, uh, make all your dreams kind of become reality, if that makes sense.
00:49:03.400 So, so I think one of the things that AI is going to be really awesome for is like, you
00:49:08.980 know, today, if I, I have like a million ideas, right.
00:49:12.680 I have like, you know, thousands and thousands of ideas and I only have so much time.
00:49:16.640 So we can only really do, you know, a few of them at a time.
00:49:20.200 And most of the ideas just go to die.
00:49:22.380 Right.
00:49:22.940 Yeah.
00:49:23.300 They do, huh?
00:49:24.080 Yeah.
00:49:24.280 That's a bummer.
00:49:25.140 It's a huge bummer.
00:49:26.180 Yeah.
00:49:26.420 And I think a lot of people, you know, for whatever reason, they may have some of the
00:49:30.400 best ideas ever, but they just, you know, they're too busy or they have like other shit going
00:49:34.960 on in their lives.
00:49:35.500 They can't make those ideas happen.
00:49:37.660 And it's great with sometimes when people are able to make the leaps and make them happen
00:49:41.160 and like devote themselves to their dreams.
00:49:42.720 But, um, but that doesn't happen enough today.
00:49:45.280 And one of the things that AI is going to help us do, I, I legitimately think so, is
00:49:49.900 it's going to help us, um, turn these ideas into reality much more easily.
00:49:54.880 So, you know, you can, um, like, you know, you're making a movie, let's say you have another
00:49:59.920 movie idea.
00:50:00.740 You can say, you can ultimately, I think you'll be able to tell an AI, Hey, I have this idea
00:50:05.000 for a movie.
00:50:05.840 What could that look like?
00:50:07.400 You know, maybe draft up a script.
00:50:09.580 Um, also who are the people who can help, you know, fund this idea?
00:50:13.360 Like who, who those people be can help reach out to them.
00:50:16.100 And then like, you know, who should we cast in it?
00:50:18.480 Basically help make the whole thing a, uh, you know, instead of those like daunting thing
00:50:23.540 that these big projects, they're usually so daunting.
00:50:26.900 You really don't know where to get started.
00:50:28.160 You kind of need a person to help you get through them instead of that AI will help you
00:50:31.940 get through it and like help do a lot of the, the sort of less glamorous work to make them
00:50:36.020 a reality.
00:50:36.720 Wow.
00:50:37.600 So I could say, for example, like, um, like AI, I would like to shoot maybe, I'm thinking
00:50:43.140 about creating an idea or shooting a film in this area or, or it's like this, it's going
00:50:48.240 to take place in this type of place.
00:50:49.560 I can give it the setting.
00:50:50.460 I can give it like a, a, a outline of the characters, like what they look like their ages
00:50:55.580 and some description.
00:50:56.360 Could you help give me, um, possible potential actors or something within a certain price
00:51:01.840 range that I can maybe cast for that?
00:51:03.780 Um, yeah.
00:51:05.060 Could you help give me like locations around the country that would fit that backdrop?
00:51:09.940 Yep.
00:51:10.140 Um, could you, uh, list me all the talent agencies that I could reach out to and you could kind
00:51:17.260 of just put those things in and then you would have sort of a, uh, a bit of a guidebook
00:51:23.260 at that point that would make your, what before was something that felt extremely daunting
00:51:28.260 shit in two minutes, you know, and then you put it in the AI, it gives you the information
00:51:32.780 back in a few minutes and maybe you're.
00:51:34.680 And I think, I think over time, it'll also be able to start doing a lot of the legwork
00:51:38.080 for you.
00:51:38.580 So it'll be able to reach out to people for you.
00:51:40.740 It'll be able to, you know, uh, figure out the logistics.
00:51:43.640 It'll be able to, to book things for you.
00:51:45.800 Like it'll be able to basically help do all the legwork to make it make, you know, whatever
00:51:50.220 it is into a reality.
00:51:51.420 Um, so very much an assistant in a lot of ways.
00:51:53.920 Yeah, yeah, yeah.
00:51:54.900 An assistant co-pilot.
00:51:56.880 Um, you know, the, the hot word in the AI world is agents, but you know, it's just,
00:52:01.940 it, it'll be something that'll help you, you know, you humans are going to be in control.
00:52:06.000 Humans are ultimately going to be telling the AIs what they want it to do.
00:52:09.580 And then, you know, Hey, what do we want?
00:52:12.140 What do we want these AIs to do?
00:52:13.020 It's ultimately going to be to like help us execute on and accomplish all of these ideas
00:52:17.980 that we have.
00:52:18.500 Um, so a genius could be three or four X if somebody's like a genius in something in
00:52:23.660 some realm or space of thought, you could three or four X them because totally like
00:52:28.820 multiply their, their output from their own brain because they could have something really
00:52:34.520 helping them, um, get done a lot of the, like the early work on things and maybe some
00:52:39.940 of the most severe work.
00:52:41.020 Yeah, totally.
00:52:41.940 Like what, I think one thing that I always feel like kind of sucks is that, um, if you
00:52:46.620 have a director you really like, they're only going to make a movie once every couple of
00:52:50.120 years.
00:52:50.360 So even if you have a director that you'd like think is amazing, you know, they just,
00:52:53.960 it's hard for them to make that many movies like, cause it just takes so much time and
00:52:57.860 effort and you know, um, there's so many bottlenecks and stuff.
00:53:01.320 So in a future with, you know, more advanced AI systems, they could, they could just churn
00:53:07.500 them out and, uh, they could, they can make so many of their ideas into a reality.
00:53:11.700 And I think that's true not only in creative areas, but it's kind of true across the board.
00:53:15.500 Like, you know, you can start new businesses more easily.
00:53:18.140 You can, um, you know, you can make various creative projects that you have happen more
00:53:22.320 easily.
00:53:22.620 You can make, like, you can, you can finally plan that event that you and your friends
00:53:26.720 have been talking about for like, you know, years and years.
00:53:29.600 So it can just like really help you, you know, we think about as like giving humans more agency,
00:53:35.200 giving humans more sort of sovereignty and just, and just enabling humans to get way more
00:53:39.720 done.
00:53:41.580 Yeah.
00:53:41.980 That's a great way.
00:53:43.040 I like some of this thought because yeah, I could be like, like my fantasy football group
00:53:47.160 and I, we do a, um, we do a, uh, draft every year in a different location, you know,
00:53:52.100 and shout out PAE.
00:53:53.500 That's our fantasy league.
00:53:55.080 J rod, everybody that's in it for the past 17 years, we've flown to a city each year
00:53:58.960 and done a draft in person.
00:54:00.440 Right.
00:54:01.600 Um, but I could say, Hey, uh, we have 10 guys.
00:54:05.960 We want to go to a nice place.
00:54:06.960 We want there to be a beach.
00:54:08.380 Um, this is the age group of our group.
00:54:11.300 Uh, you know, these would kind of be the nights and it, you know, just to really give
00:54:14.700 me like a, just a nice plan of, Hey, here's 10 possibilities.
00:54:17.820 Right.
00:54:18.100 Something like that.
00:54:18.780 And then even more like with a movie, this is what I worry you'd, you'd run into.
00:54:23.360 Say if I'd be like, okay, I have two main characters and this is kind of what I would
00:54:27.040 like to happen.
00:54:27.840 Could you help me with a first act of a three act movie?
00:54:32.960 How do you know?
00:54:33.760 Everybody just doesn't get the same movie then.
00:54:36.340 Like, that's what I would start to worry that everything that you have to create is
00:54:39.480 all going to be very similar.
00:54:41.320 Yeah.
00:54:41.520 So then I think this is where it comes into like, this is where human creativity is going
00:54:46.120 to matter because it's going to be about, then it's about like, okay, what is the, how
00:54:51.000 am I, you know, how am I directing the AI system?
00:54:54.180 Like, what are my, what are the tricks I have to make the AI give me something that's different
00:54:58.440 and unique?
00:54:59.100 And that's, that's not different at all from, you know, how creative stuff works today.
00:55:03.560 Like even on social media or anywhere, you know, this, like you still add your own spice
00:55:07.440 to it.
00:55:07.860 You need, yeah, you need an angle.
00:55:09.200 You always need to like have something that's going to make it like different and interesting
00:55:13.820 anew.
00:55:14.940 So that's not going to change.
00:55:16.100 Like we're, humans are always going to have to figure out based on where culture's at,
00:55:19.600 based on where, you know, what the dialogue is, what the, what the discourse is, all that
00:55:24.280 kind of stuff.
00:55:24.840 What is my fresh take?
00:55:26.620 That's where, that's really one of the key things that humans are going to have to keep
00:55:30.200 doing no matter what.
00:55:31.400 Right.
00:55:31.720 And, and, and some of the, like a lot of films and books, a lot of it is, there's just like
00:55:37.200 a problem.
00:55:38.120 There's a, there's maybe a information you learn.
00:55:42.240 Then there's a red herring and then there's a solution.
00:55:44.920 Like that's a lot of stories, right?
00:55:46.680 So if something just gave you the basis and then you go through and make everything your
00:55:51.980 own, um, cause a lot of things we don't, there's only so many templates for things.
00:55:57.040 Um, huh?
00:55:59.060 Yeah.
00:55:59.440 So, so say for example, say you might need to hire people at a company then that would
00:56:04.640 help direct your AI, like somebody who's good at managing AI, uh, and giving it the
00:56:11.840 best prompts or the best way to ask it questions, uh, to get the perfect feedback for your company.
00:56:18.820 Yeah.
00:56:19.320 So those would be new jobs.
00:56:20.800 You, those would be actual new people you would need.
00:56:23.140 Yeah.
00:56:23.600 So, so tons of new jobs, right?
00:56:25.340 Well, first I think just like helping to, you know, kind of what we were talking about
00:56:29.760 before these jobs around helping to improve the data and, and, um, contribute to the AI
00:56:36.360 systems.
00:56:37.160 That's just going to keep growing for a long, long time.
00:56:39.620 And then as AI gets better and it gets used in more areas, then you're, there are going
00:56:44.220 to be a lot of jobs, uh, that pop up just exactly as you're saying, which is how do you,
00:56:49.780 what's the best way to use this as a tool?
00:56:51.520 What's the best way to leverage it to, you know, um, actually make some of these, uh,
00:56:57.720 these applications or, you know, whatever you want to build a reality.
00:57:01.580 And then, and then there's going to be folks who need to, once those are built, like, how
00:57:05.800 do you keep improving it?
00:57:06.700 How do you keep making it better?
00:57:07.720 How do you keep, um, how do you keep it fresh?
00:57:10.420 How do you keep it, um, keep it going?
00:57:13.220 And then how do you also make sure it doesn't do anything bad, right?
00:57:15.940 Like how do you make sure that the AI doesn't accidentally spam a million people or whatever
00:57:20.260 it might be, um, and you make sure that it sort of is, uh, like operating in a good way.
00:57:25.980 Fahim Anwar, I was watching him bring up a picture of Fahim.
00:57:29.360 This is one of the funny, this guy, this is the most creative comedian, uh, in America.
00:57:37.740 Undeniable.
00:57:39.040 Um, he is so funny.
00:57:41.180 He had any, everybody would say he's, he's one of the few comedians that everybody goes
00:57:45.720 in there to watch him perform.
00:57:46.760 Um, he had a bit the other night he talks about, he got into a Waymo, right?
00:57:50.880 Uh, um, a car.
00:57:52.620 And I see so many Waymos now, which are cars that are just, nobody's in them, you know,
00:57:57.000 but they're going right.
00:57:58.240 So he had this bit, he's like, he got into a Waymo and it started complaining about its
00:58:02.940 life to him.
00:58:03.500 He's like, I've had a shitty week.
00:58:05.200 Like the car's just talking to him.
00:58:06.460 And he's like, what the, and he's like, now I, no matter what, I still have to talk to
00:58:10.280 my shitty driver.
00:58:11.480 That's what he was saying.
00:58:12.200 Um, if you get a chance to see him though, that guy is, he's fascinating.
00:58:16.880 Um, but like, so what, what companies right now should kind of look to hire an AI guy?
00:58:22.260 Cause we've been thinking about it.
00:58:23.460 Like we had some, uh, we had Cat Williams on a last week and we're like, Hey, can you
00:58:28.940 create visuals of Suge Knight and Cat Williams riding bicycles down Sunset Boulevard?
00:58:33.080 Right.
00:58:33.600 And this is the one they sent back a little while later.
00:58:38.060 Christopher Folino is the guy's name.
00:58:39.860 FYI.
00:58:40.260 And this is just, this was right where I say it was like by the comedy store on Sunset
00:58:45.180 Boulevard.
00:58:45.880 There you go.
00:58:47.180 I mean, this looks like it's out of a movie kind of.
00:58:49.080 Yeah, totally.
00:58:50.080 I mean, and the guy did that in a little, in just a little bit of time.
00:58:52.880 What types of people would you get to hire?
00:58:54.840 I mean, this is.
00:58:55.640 That's you.
00:58:56.380 Yeah.
00:58:57.880 If I was healthier, if I had healthier gums too.
00:59:00.660 But what about this?
00:59:01.620 What kind of like what companies right now, what job spaces?
00:59:05.100 Cause we want to get an AI guy.
00:59:06.480 Right.
00:59:06.760 But I don't really, my brain is like, well, what do they do?
00:59:09.460 How do I, how would I keep them busy?
00:59:12.620 You know, I could get them to make some animations and ideas, but what type of people need an
00:59:17.660 AI person right now?
00:59:19.080 Do you feel like?
00:59:20.340 I kind of think about AI.
00:59:21.940 It's kind of like the internet where, you know, eventually everybody's going to need to figure
00:59:27.620 out how to, how to utilize it, how to best, how to best use it for their industry or whatever
00:59:32.460 they do, how to make it then more efficient.
00:59:34.720 Like it's something that I think everybody is going to, is going to need to adopt at some
00:59:40.440 point.
00:59:40.740 So, um, you know, might as well start earlier because eventually just like how, you know,
00:59:46.660 every, basically every company has to figure out how to use the internet well, and how to be
00:59:50.560 smart about, you know, the internet and digital stuff.
00:59:53.420 Every company is going to have to be smart about AI, how to use AI, how to make, um, how to have a
00:59:58.960 unique twist on it so that, you know, their stuff stands out relative to other people's.
01:00:02.860 That's going to be really important.
01:00:03.980 But, um, so we see, I mean, in our work, you know, we work with all these, um, all these big
01:00:10.240 companies in America and we see it everywhere from, you know, uh, we worked at Time magazine
01:00:15.440 on some stuff and then we worked with, uh, Toyota on some stuff for their cars.
01:00:19.980 And we worked with, um, you know, large pharmaceutical companies for the biology stuff we were talking
01:00:24.540 about, large hospitals for, you know, helping to treat patients.
01:00:27.780 Like really it's across the board.
01:00:30.140 Um, and I think that goes for, you know, obviously these like really big businesses,
01:00:34.480 but also for, for smaller businesses, you know, there's always interesting ways to utilize
01:00:38.620 it to, to, uh, to provide a better product or a better experience or better content for,
01:00:44.500 you know, whoever you need to do that for.
01:00:46.780 Yeah.
01:00:47.140 Cause I guess right now we're like, there's certain moments like, Hey, well, let's animate
01:00:49.540 this moment or see what AI would look like.
01:00:51.140 It adds some visual effects to some of our episodes.
01:00:53.940 So that's something we'd like to do to just be fun and creative.
01:00:57.140 I would like to maybe create like some sort of an animated character.
01:01:00.400 We already have a great animator and we want to keep that, but to have an AI space where
01:01:04.040 it's like, you know, cause they have something, they had a little cat the other day or something
01:01:07.500 and he was going to war and I was like, damn dude, this is, and it was AI, you know, totally.
01:01:12.860 And they had a baby who was getting in a taxi and I was like, this shit is elite, you know,
01:01:16.420 I don't know if it's illegal or not, but it seems, you know, it doesn't seem, you know,
01:01:20.780 it's definitely a tool for storytellers, right?
01:01:23.140 Like it's right.
01:01:23.940 It's, it'll help people with a creative vision or, or war cats.
01:01:29.400 Yeah.
01:01:29.600 That's a story right there.
01:01:30.940 Special forces cats.
01:01:32.700 Would YouTube recognize this?
01:01:36.020 Damn brother.
01:01:36.900 If they show up, dude.
01:01:39.080 Wow.
01:01:39.820 This honestly, it looks, it looks bad-ass and it looks like cats have been waiting to do
01:01:43.840 this a long time.
01:01:44.920 Yeah.
01:01:45.480 That's the crazy shit about cats.
01:01:46.880 You look in their eyes.
01:01:47.880 Now their outfits finally match the look in their eyes.
01:01:50.480 That's what it feels like, dude.
01:01:53.300 Wow.
01:01:53.900 That's dude.
01:01:54.640 That's, uh, that's Fremen, Fremen cats.
01:01:58.740 Oh my gosh.
01:02:00.240 Yeah.
01:02:00.540 Yeah.
01:02:00.800 So that's going to get alarming, dude.
01:02:03.560 That's going to get hella alarming.
01:02:05.400 That's a movie right there.
01:02:06.420 It is.
01:02:06.900 But it's almost like you could make, like, that's what I want to get.
01:02:09.540 I want to get somebody to help us think, Hey, help make these little segments that we
01:02:13.960 can take our creativity.
01:02:15.080 And instead of me thinking, man, I got to write this huge, crazy script for just a kind
01:02:18.980 of small things, you know, little moments.
01:02:21.080 Yeah.
01:02:21.400 Can you help me make this happen?
01:02:23.020 I think that's actually the key thing, which is AI, like AI will just be something that
01:02:28.380 we turn to, to help make our dreams happen.
01:02:31.400 Like we help make our ideas and our dreams and our, you know, whatever we want to do
01:02:34.940 happen more easily.
01:02:36.200 Got it.
01:02:36.580 That's like at the core of what it'll be.
01:02:39.020 Yeah.
01:02:39.220 Um, who's the, who's the current leader in AI development?
01:02:44.160 Like, is it America?
01:02:45.680 Is it China?
01:02:47.000 Is it Israel?
01:02:48.200 Is it trying to think of another, uh, superpower, uh, Russia maybe?
01:02:53.360 Um, or yeah, who is it?
01:02:54.640 Taiwan I know has a lot of the chips and they manufacture a lot of the chips over there.
01:02:59.380 Um, and does it matter what country leads in AI or does it just matter the company like
01:03:07.080 scale or another AI company?
01:03:10.000 So does that make sense?
01:03:11.800 That question?
01:03:12.360 No, totally.
01:03:12.940 Totally.
01:03:13.460 Yeah.
01:03:13.680 So, so today America is in the lead, but, um, but China as a country is, is sort of hot
01:03:20.100 on our tails.
01:03:21.120 Um, like there was, uh, there was all that news about deep seek a couple of weeks ago
01:03:25.220 and, uh, deep seek in still in most places around the world is the number one most downloaded
01:03:30.680 app.
01:03:31.180 You know, it's downloaded a ton and everywhere around the world, frankly.
01:03:34.320 Um, and is a Chinese AI system, right?
01:03:37.840 And so it's starting to rival a lot of the American AI systems also because it's free
01:03:42.320 and, you know, it, uh, it kind of like shocked the world.
01:03:45.760 So right now, if you kind of look at it, um, U S and China are, are a little bit neck and
01:03:52.540 neck.
01:03:52.800 Maybe the U S and America is like a little bit ahead and you kind of like look at, you
01:03:57.820 know, if you go back to each of the three pieces that I talked about.
01:04:00.360 So, um, the chips and the computational power, the data and the algorithms, um, if you were
01:04:06.600 to rack and stack U S versus China on each one of those, you know, we're probably, we're
01:04:12.380 ahead on the computational power because the United States is the leader at developing the
01:04:17.540 chips and, and most of the most advanced chips are American chips.
01:04:20.920 Um, they probably beat us out on data, um, cause China, they've been investing into data
01:04:28.100 for a very long time as a country.
01:04:29.860 And then on algorithms were basically neck and neck.
01:04:32.820 Um, so it's a pretty tight race.
01:04:36.600 Um, and you know, to your question about does it matter or what is, what does this mean?
01:04:42.240 Um, I, uh, I think it's actually going to be one of the most important, uh, you know,
01:04:48.400 questions or most important races of our time is, is it U S or Chinese AI that wins?
01:04:54.920 Because, um, you know, AI is more than just being a tool that, uh, that, you know, we all,
01:05:01.060 we can all use to make our, you know, build whatever we want to, or make whatever ideas we
01:05:05.560 want to happen.
01:05:06.080 It's also, um, you know, it's a, it's a cultural, uh, staple, right?
01:05:12.320 You know, if you talk to an AI, that AI is kind of a reflection of our culture and our
01:05:17.660 values and all that stuff.
01:05:18.880 So in America, we value free speech and, you know, the AIs are, you know, need to, are
01:05:23.500 built to support that.
01:05:24.840 Whereas in, in China, there's, there isn't free speech.
01:05:28.780 And so, um, you know, if, if the Chinese AIs are the ones that take over the world, then
01:05:33.780 all these Chinese ideologies are going to become exported all around the world.
01:05:38.860 And, and so, so first is there's a couple of dimensions here that I think matter.
01:05:43.020 So first is just the cultural element, which is like, do we want kind of democracy and free
01:05:48.340 speech, um, to be the, the cultural AI that wins, or do we want sort of the more, um, you
01:05:54.400 know, frankly, totalitarian AIs in China to be the ones that win.
01:05:58.240 And then there's sort of the, um, there's like the, uh, you know, you start getting
01:06:03.240 to economically.
01:06:04.800 So AI is going to be something that helps, um, all the companies in the United States
01:06:10.100 thrive.
01:06:10.580 And so if the USAI wins, then we're going to, you know, the economy will grow faster.
01:06:15.960 We're going to have more and more opportunity, you know, the country will still be better
01:06:19.100 and better and better.
01:06:20.060 Um, and, and, um, the economy will keep growing.
01:06:23.200 We're versus if Chinese AI wins, then Chinese economy is going to grow way faster than the American
01:06:27.560 economy.
01:06:28.080 So there's sort of the, the cultural piece, the economic piece.
01:06:31.220 And then lastly, there's, there's the, there's kind of the warfare piece, right?
01:06:36.120 And, you know, AI, we haven't really talked about it, but has clear potential to be used
01:06:42.400 as a military technology.
01:06:44.120 And we don't want, you know, we don't want, uh, another country to have, because they use,
01:06:50.480 they have better AI to have a much stronger military than, than America's.
01:06:54.160 So like, how would they do that?
01:06:55.980 How would they have a better AI or how would they use it to have a better military?
01:07:01.300 Yeah.
01:07:01.360 How would they use it to have a better military?
01:07:03.160 Like, why is that kind of a concern or potential concern?
01:07:06.080 Yeah.
01:07:06.420 So, so one of the things that's been happening over the past, you know, decade for sure is,
01:07:12.100 uh, is lots of hacking, cyber hacking going on.
01:07:15.540 So, you know, in America, um, even recently we had this huge cyber hack called salt typhoon,
01:07:22.140 where, um, where the Chinese hacked our, uh, telecommunications company.
01:07:27.520 So hacked our, for the phone companies.
01:07:29.420 Damn, they did.
01:07:30.120 They got it.
01:07:31.060 And they got all sorts of crazy data as a result of that.
01:07:34.980 Oh, they know I'm a pervert.
01:07:36.280 I'll tell you that.
01:07:37.180 Yeah.
01:07:37.440 Look at this.
01:07:37.800 This happened in 2020.
01:07:39.380 Salt typhoon is in, is widely understood to be operated by the China's ministry,
01:07:42.520 state of security.
01:07:43.980 Um, it's foreign intelligence service and secret police Chinese embassy denied all allegations
01:07:48.780 saying it was unfound and irresponsible smears and slanders, um, high profile cyber espionage.
01:07:56.660 Um, in 2024, U.S. officials announced that hackers affiliated with salt typhoon had access
01:08:01.860 to computer systems of nine U.S. telecommunications companies later acknowledged to include Verizon,
01:08:06.840 AT&T, T-Mobile, Spectrum, Lumen, Consolidated Communications, and Windstream.
01:08:11.380 And the hackers were able to access metadata of users' calls and text messages.
01:08:15.480 Fuck, homie.
01:08:16.540 We're fucked, dude.
01:08:17.340 I am.
01:08:17.920 You seem good.
01:08:19.180 Including date and timestamps, source and destination IP addresses.
01:08:24.000 Ah, shit.
01:08:25.200 Keep going.
01:08:25.700 Keep going.
01:08:26.860 And phone numbers from over a million users, most of which were located in Washington, D.C.
01:08:30.780 Good.
01:08:31.920 Light them up, dude.
01:08:33.480 In some cases, the hackers were able to obtain audio recordings of telephone calls made
01:08:36.940 by high profile individuals.
01:08:38.940 Such individuals reportedly included staff of the Kamala Harris 2024 presidential campaign,
01:08:44.640 as well as phones belonging to Donald Trump and J.D. Vance.
01:08:47.740 According to Deputy National Security Advisor Ann Neuberger, a large number of the individuals
01:08:52.520 whose data was directly accessed were government targets of interest.
01:08:56.200 Wow.
01:08:57.320 Yeah.
01:08:58.080 That's crazy.
01:08:59.120 So do you think this also, that that whole thing could be not real and it's just a story
01:09:04.400 that was created?
01:09:05.040 That seems pretty real.
01:09:07.820 Okay.
01:09:08.100 Because there's real, like, I mean, there's like 20 stories where the Chinese have hacked
01:09:14.940 American systems.
01:09:16.240 Like they hacked, this was, this must have been close to 10 years ago now, but the Chinese
01:09:21.620 hacked the database in America that stored all of the clearances.
01:09:27.420 So they hacked in, they, they managed to hack into knowing who are literally all of the Americans
01:09:33.800 who have security clearance.
01:09:35.380 Oh, security clearance.
01:09:36.600 Yeah.
01:09:36.820 I thought you meant what was on sale.
01:09:38.320 I was like, who cares?
01:09:40.780 That's great though.
01:09:41.760 Damn.
01:09:42.220 Oh, damn.
01:09:43.060 So they knew everybody who had, who knew.
01:09:46.160 Oh, who knew information.
01:09:47.260 Who knew secrets.
01:09:47.920 Yeah.
01:09:48.220 So once they knew that, then they know it.
01:09:49.940 Well, that's a great point of operation to go then.
01:09:52.160 Well, now let's get, find their data.
01:09:53.700 They can just hack all of them.
01:09:54.760 Yeah.
01:09:55.220 So, so they, so already China is hacking the shit out of America.
01:10:00.260 That is definitely not an understatement.
01:10:02.120 Yeah.
01:10:02.640 It's exciting kind of, I mean, it's unfortunate, but it's also exciting.
01:10:05.380 I like some espionage, you know, I can't sleep unless somebody's fucking really going
01:10:09.400 through it.
01:10:11.100 But then, but then, so this is.
01:10:13.200 But that's a real AI thing.
01:10:14.140 So AI can be used to do that because you can like prompt it to go and do things like that.
01:10:19.940 Yeah.
01:10:20.280 Yeah.
01:10:20.480 Yeah.
01:10:20.620 There's, there's a bunch of recent.
01:10:22.040 Cause it's all data.
01:10:22.880 Yeah.
01:10:23.280 There's a bunch of recent demonstrations where AI is just like how in, in go, how AI beat
01:10:29.620 the world's best go players, AI starting to beat the world's best cyber hackers and
01:10:35.280 the world's best.
01:10:36.300 Yeah.
01:10:36.780 So it's a, I don't know if you saw Mr. Robot.
01:10:39.940 I didn't, but I DMed with Magnus, Magnus Carlson.
01:10:42.920 No, that's cool.
01:10:43.720 Pretty cool.
01:10:44.220 So, but no, Mr. Robot, is it good?
01:10:46.840 It's, it's just, it, it shows like all this hacking stuff and like, you know, cool.
01:10:50.780 It makes it seem really cool.
01:10:52.380 Okay, cool.
01:10:52.860 I'm going to check it out.
01:10:53.880 But, but yeah, no hacking, like, like you're going to have AI that are hacking everything
01:10:58.820 in America.
01:10:59.340 And this is one place where this is like US versus China will be really come to life,
01:11:04.740 which is who has better AI that's better at defending against the hacks from the other
01:11:09.120 guy, as well as hacking the other, the other guy's systems.
01:11:11.880 Um, that's going to be, that'll just, that'll just start happening.
01:11:15.420 That's basically starting to happen right now or it's, you know, cyber warfare has been
01:11:18.820 happening and then AI cyber warfare is going to start happening.
01:11:22.440 Yeah.
01:11:22.900 Basically as soon as, you know, as AI gets better.
01:11:26.140 Yeah.
01:11:26.620 We had a Craig Newmark who created Craigslist on.
01:11:29.920 Yeah.
01:11:30.120 And he was talking about how, what if they hacked like everybody's Teslas to all just drive
01:11:36.840 off a cliff one day, or they hacked everybody's ovens to go up to 400 degrees in the middle
01:11:41.680 of the night while you're sleeping.
01:11:42.740 And then fire started like just things like that, that you don't start to think about
01:11:46.540 that.
01:11:47.300 Um, once something's connected to the grid or something like that are connected through
01:11:50.600 routers and wifi is that, that, that could be feasible.
01:11:53.060 Yeah, no, it's, there's a lot of, um, there's a lot of things they could do that won't even
01:11:57.480 like seem like that big a deal at the time, but could be really, really could, could
01:12:03.200 be a big deal.
01:12:04.020 So for example, let's say the Chinese, like they just took out all of the military, uh,
01:12:09.980 like communication systems and all the military software systems, um, like took out the satellites,
01:12:14.880 took out all that for like 10 minutes.
01:12:16.700 And in those 10 minutes, they like, you know, invaded somewhere or they like did some crazy
01:12:21.140 thing.
01:12:21.860 Like they can just, there's, there's the, the thing about, um, about this stuff is like
01:12:27.960 everything at, you know, as the world's become more connected, it also enables, you know,
01:12:33.180 different kinds of warfare.
01:12:35.240 So, uh, so cyber warfare is really big.
01:12:37.760 Also like the, um, uh, you know, uh, information warfare is another big one.
01:12:43.980 So what does information warfare mean?
01:12:45.660 So this is information warfare is all about, you know, um, in a place, what is, what are
01:12:52.720 like the stories?
01:12:53.600 This is kind of gets to like, you know, the, like propaganda or, you know, these conspiracy
01:12:57.920 theories, like what are the stories that, um, in a place that we're trying to make happen
01:13:04.280 or not make happen.
01:13:05.120 And we know that China does a bunch of information warfare called IW it's sometimes called, but
01:13:10.040 they, they have a, they have whole operations.
01:13:12.620 This is actually the craziest part.
01:13:13.760 They have like, they've, they've hired the, the Chinese military at various points has hired
01:13:19.400 millions and millions of people who are supposed to be on like various, like chat groups and
01:13:25.120 WhatsApp groups and WeChat groups and whatnot, and just, um, spread the right kind of stories
01:13:31.300 that'll make it such that they can like, um, they can make their political aims happen.
01:13:36.200 So for example, in, uh, when, when China wanted to start like, kind of like, um, uh,
01:13:43.380 I don't know what the word is like, uh, like, uh, when Hong, when they, when China wanted Hong
01:13:48.480 Kong to become a part of China again, which happened just not to, not to, you know, pretty
01:13:53.280 recently PRC, right?
01:13:54.620 To the PRC.
01:13:55.500 Exactly.
01:13:55.860 When they want to propaganda, is that the word you're looking for?
01:13:57.620 Yeah, they would, they would, yeah, exactly.
01:13:59.500 They would use a lot of propaganda and that's information warfare to be able to just make
01:14:03.940 it such that that all happened much more easily.
01:14:06.800 What's it's unbelievable.
01:14:07.740 I'll see stories even about, I'll be going through TikTok and see a story come up about
01:14:11.060 something in my life that is not even true.
01:14:13.460 Insane.
01:14:13.940 Some of it looks fun, but never was a part of my existence.
01:14:17.240 And then you'll see hundreds of people have said something about like, and they'll, and they'll
01:14:21.160 have friends that'll ask me about it.
01:14:22.460 I'm like, that's just crazy.
01:14:23.540 Like, totally.
01:14:24.640 But I, so yeah, it's amazing to think of how many things we're watching or absorbing
01:14:29.520 that are just, are, are created just to dilute us.
01:14:38.660 Yeah.
01:14:38.940 I don't know if dilute is the word, is it?
01:14:40.680 Trick us or make us think something.
01:14:42.660 Just to fucking Halloween us out.
01:14:44.080 Um, wow.
01:14:46.700 So there was a lot of interesting things.
01:14:48.900 You know, what's crazy, man?
01:14:50.380 Some things makes life scary, but then it also makes it interesting.
01:14:53.120 You know, it also makes it interesting in a, in a, in a, in a fun way.
01:14:56.520 Um, how do we, how much do we have to fear, say if a certain country or a certain company
01:15:03.340 owns an AI right in that country and that company, um, if they're Chinese, if they, um,
01:15:10.940 have a certain religious belief or they have, uh, information that they don't, they want
01:15:17.980 to adjust history, how much would a company be able to like, say they keep certain data
01:15:24.780 out of their information system, but, and then after a while, if you're, if that's a company
01:15:31.180 that kind of takes the lead in AI or one of the main ones, then the truth could disappear.
01:15:39.020 Is that true that if somebody loaded it just with the data that wasn't factual, that we could
01:15:44.840 start to not have the truth, is that, does that make any sense or no?
01:15:49.300 Yeah, totally.
01:15:49.860 I think this is, this is something that, um, it's definitely the right thing to worry about.
01:15:54.200 So, so first off, if you ask any Chinese AI system, so any AI system that comes out of
01:16:01.220 China, if you ask any of them about, you know, a question about President Xi, the, you know,
01:16:07.120 the, the leader of the Chinese government, or you ask them any question about, you know,
01:16:11.160 Tiananmen Square, or, you know, all these like key historical or, or, you know, cultural
01:16:16.520 things relevant to China, um, it'll say it can't talk about them because there's regulation
01:16:22.700 in China that if you talk about some of these things, like you're, you're going to get shut
01:16:27.780 down, you're going to have a really bad day.
01:16:29.220 There's like cases where, um, the Chinese government disappears people, um, which we don't know what
01:16:35.100 happens to them, but they do disappear.
01:16:36.300 So there's the, the, this is part of the thing that's worrying, especially about, um, China
01:16:43.320 versus us, even before you get into any of the military stuff that we're talking about,
01:16:47.660 it's just like the Chinese, Chinese AI systems are censored and are going to, you know, be,
01:16:55.480 you know, they're going to, they're going to erase certain historical elements or they're
01:16:59.200 going to be, um, yeah, look at that.
01:17:01.900 This is deep seek.
01:17:02.640 You know, you ask it, is president Xi of China a good guy?
01:17:06.580 Sorry, that's beyond my scope.
01:17:07.940 Let's talk about something else.
01:17:09.000 Oh, let's talk about something.
01:17:09.840 Not only does it say it's beyond my scope, it says, let's talk about something else.
01:17:12.640 That's why, that's interesting.
01:17:14.760 It's a good pivot.
01:17:15.400 That's a good, that's a great, Hey, let's talk about something else, huh?
01:17:19.540 Wow.
01:17:20.000 Get this Yao Ming Jersey, homie.
01:17:22.560 Um, but, and people always, people also have to remember about China that they are, that's their
01:17:27.820 whole government.
01:17:28.560 Their whole system is like that.
01:17:29.780 So sometimes when people are like China does this, but that's how they're built, right?
01:17:33.480 They're built to like only give out certain information to their people and to, um, have
01:17:38.580 communism, right?
01:17:39.680 Yeah.
01:17:40.240 Yeah.
01:17:40.620 So, I mean, but that could also happen with American companies, right?
01:17:43.440 We can have an American company that owns it and they only want certain information in
01:17:46.580 there.
01:17:46.760 That could happen anywhere.
01:17:47.480 Like China, that's probably going to be, cause that's their MO sort of.
01:17:51.080 Yeah.
01:17:51.200 In China it's regulated.
01:17:52.640 So basically like.
01:17:54.780 Oh, the government has control.
01:17:55.860 They have control.
01:17:56.500 So, so, um, like there were, there are these stories about how there were Chinese news
01:18:02.240 sites, um, news sites, news sites.
01:18:04.920 Yeah.
01:18:05.100 And they would, once a Chinese news site, um, accidentally led an article about, uh, President
01:18:12.280 Xi, how he kind of looks like Winnie the Pooh.
01:18:14.920 Um.
01:18:15.360 Oh yeah.
01:18:15.840 They let that.
01:18:16.540 Bring him up.
01:18:17.060 Oh, a hundred acre wood gang, son.
01:18:20.320 I was out there, boy.
01:18:22.100 I was out there, bro.
01:18:24.300 Christopher Robbins, dude.
01:18:26.260 Get him up.
01:18:28.140 Oh, he does.
01:18:29.800 Yeah.
01:18:30.340 That's awesome.
01:18:31.640 Yeah.
01:18:32.120 But if you talk about this in China, you like are risking your life.
01:18:36.200 No.
01:18:36.300 So what happened, what happened when this happened, this happened on a, on a news site
01:18:39.660 in China.
01:18:40.220 And then the, uh, the, the CEO of that company, like they shut down the whole app was shut
01:18:46.760 down for like a week, um, in the aftermath of that.
01:18:49.720 And then the, the CEO disappeared for a week and, uh, we don't know what happened to him.
01:18:55.460 But then as soon as he came back, he was like, there was like this weird video where
01:18:59.280 he was like, you know, super apologetic and apology.
01:19:01.680 I mean, it's, it's kind of, it's pretty scary.
01:19:04.800 Yeah.
01:19:06.080 Wow.
01:19:07.280 So, um, so in China there, it's like, this is the government has control.
01:19:13.580 You know, you don't have AI, CIS companies, AI, any companies that can, that can talk about
01:19:20.240 this stuff.
01:19:20.800 Right.
01:19:21.120 So it's heavily regulated there where it's not the, that's not the case here.
01:19:23.960 Yeah.
01:19:24.120 In America.
01:19:24.880 And this is, I think we have to be diligent and make sure this continues to be the case.
01:19:28.740 But, and here's an example right here.
01:19:30.000 Just to interrupt you, but so we get at the point in, does Winnie the Pooh look like any
01:19:34.360 world leaders?
01:19:35.060 And that's on the Chinese version.
01:19:37.220 Uh, and it says, I am sorry.
01:19:38.240 I can't answer that question.
01:19:39.140 I'm an AI assistant designer to provide helpful and harmless responses.
01:19:42.620 Whereas the chat GBT says, Winnie the Pooh has often been compared to world leaders,
01:19:46.700 particularly Xi Jinping, Xi Jinping, president of China.
01:19:50.480 Boy.
01:19:51.200 Wow.
01:19:53.360 So that's funny, but it's just funny.
01:19:56.080 Yeah.
01:19:56.280 One.
01:19:56.900 So it just shows you how that can easily happen.
01:19:59.000 And this is kind of a, this is like a, uh, a relatively innocuous example, but.
01:20:04.120 What does innocuous mean?
01:20:05.120 Like, uh, it's relatively harmless.
01:20:06.880 Like this isn't, I mean.
01:20:07.840 Right.
01:20:08.060 This is harmless.
01:20:08.740 Yeah.
01:20:08.900 This is harmless.
01:20:09.620 But there's stuff where like, um, like in China today, they have large scale, effectively
01:20:15.820 concentration camps and reeducation camps for the ethnic minority in China, the Uyghurs.
01:20:21.840 And that's something that.
01:20:23.740 The Uyghurs?
01:20:24.480 The Uyghurs.
01:20:25.360 Yeah.
01:20:25.600 Hell yeah, boy.
01:20:27.740 Shout out Brian Purvis, dude.
01:20:30.280 They're recognized as the titular nationality of the, um, of a region in Northwest China.
01:20:36.820 And they've, they're sending them to rehabilitation camps to change their views and information.
01:20:41.060 Yeah.
01:20:41.340 Yeah.
01:20:41.480 So look at this, this, uh, look at this guy.
01:20:43.980 Persecation of the Uyghurs in China.
01:20:46.240 Since 2014, the government of the PRC, People's Republic of China, has committed a series of
01:20:51.340 ongoing human rights abuses against the Uyghurs and other Turkish Muslim minorities in
01:20:56.420 Xinjiang, which has often been characterized as persecution or as genocide.
01:21:01.140 Wow.
01:21:01.920 They got their own Gaza rocking over there.
01:21:04.080 It's pretty bad.
01:21:04.880 It's unfortunate, man.
01:21:05.700 It's pretty bad.
01:21:06.040 It's really sad.
01:21:06.740 Mass detention, government policies, and forced labor.
01:21:09.380 And they're just trying to change the way that they think and view stuff.
01:21:12.520 So it's basically, it's.
01:21:14.540 Yeah.
01:21:14.660 It's just like erasing their culture, you know, pulling them into China.
01:21:18.900 Um, it's awful.
01:21:20.540 Every place has done this over the years.
01:21:22.120 And that's just, that's the craziest thing about history.
01:21:24.240 It's like every place is guilty of this same thing.
01:21:27.720 Totally.
01:21:28.000 And it just, it's, it's unfortunate.
01:21:30.260 So it's, it's hard to point fingers, you know, I mean, you can point them, but you have to
01:21:33.620 point one at your own people as well.
01:21:35.800 But that's the thing where if you ask, like it, if you ask a Chinese AI, it's not going
01:21:40.800 to tell you about that.
01:21:41.680 And it won't, it won't come clean about that.
01:21:44.160 Whereas thankfully in, in America, at least when we see people or groups of people or countries
01:21:51.520 doing bad things, we can call it out.
01:21:53.060 We can talk about it.
01:21:53.920 We can make sure it doesn't happen in the future.
01:21:56.040 Um, so that's part of the, uh, that's one of the things that's, that could happen.
01:22:01.680 That could happen.
01:22:02.320 It's like you, you could have, I mean, it's kind of dystopian, but you know, I think there's
01:22:06.900 a real case where let's say the Chinese AI is the winning AI.
01:22:10.700 Like we're all using Chinese AI.
01:22:12.460 And then all of a sudden we're like, we were shut out from information about what are like
01:22:18.020 awful things happening in the world or what awful things the government's doing.
01:22:20.720 Like we might just not be able to know about what's going on.
01:22:24.200 And you know, what's weirdly, and I hate to say this, maybe it, maybe it's silly.
01:22:28.460 I don't know.
01:22:28.940 It might be a blessing and a curse.
01:22:30.460 And sometimes, cause sometimes it's like, you're, you're overwhelming.
01:22:34.420 Yeah.
01:22:34.760 You're so inundated with the overwhelmingness of what's often is not the best stuff.
01:22:40.180 Sometimes you get a lot of humor stuff too, in social media reels, but you can get scrolling.
01:22:44.340 You get caught in some zoom scrolling.
01:22:46.200 Yeah.
01:22:46.520 And it starts to feed you.
01:22:48.200 That's the sickness of it.
01:22:49.720 Yeah.
01:22:49.840 It's like, Hey, we, you, this isn't, we know this information probably to make you feel
01:22:53.180 good.
01:22:53.400 They're not thinking about it like that.
01:22:54.420 They're just a machine, but you know, it doesn't, it adds stress to your, it makes you
01:22:57.520 agitated towards a group or ethnicity or something or yourself even.
01:23:02.320 And then you continue to, it continues to feed it to you.
01:23:06.160 Do you fear that that could happen to AI from our government?
01:23:09.760 Like, have you been approached by the government to try and cause you work with the government
01:23:13.280 some, right?
01:23:13.780 We work with the government.
01:23:14.480 Yeah.
01:23:15.940 We work, yeah, we work a lot with, with the government to make sure that they're using
01:23:20.380 these AIs and they're actually like, you know, to, as to my point on, we don't want
01:23:25.060 trying to get the jump on us on AI use for all these, you know, all these nefarious purposes.
01:23:30.400 So we got to make sure that our AI is, is, is advancing faster.
01:23:34.920 Is that one of your biggest employers or is that employer employee?
01:23:39.020 Is that one of your biggest?
01:23:39.660 Customers, customers.
01:23:40.380 Is that one of your biggest customers?
01:23:41.880 They're a big one.
01:23:42.780 Yeah.
01:23:42.920 They're a big one.
01:23:43.960 Um, not our biggest, but, uh, but they're an important one.
01:23:46.680 I mean, I, I grew up in a government lab town, so it's, uh, so it's just part of, it's also
01:23:52.800 part of your existence.
01:23:53.880 Really?
01:23:54.160 You've known about the relationship between government and, um, technology and technology.
01:23:58.740 Yeah, totally.
01:23:59.720 Wow.
01:24:00.960 But, uh, no, I don't think, I mean, I, dude, you should be a superhero almost, dude.
01:24:05.360 It's kind of crazy.
01:24:06.500 Math, you know?
01:24:07.340 Yeah.
01:24:07.800 It goes a long way.
01:24:08.540 Oh, hell yeah, dude.
01:24:10.280 Divide these nuts, dude.
01:24:11.720 That's what I tell them.
01:24:12.520 I just asked, uh, DeepSeek, who are the Uyghurs?
01:24:16.320 And at first it spit out like a Wikipedia response.
01:24:19.240 It said there were people and there's been like persecution from China.
01:24:22.480 That's, that's, uh, debated and it refreshed.
01:24:24.980 And then it gave this, I was waiting to pull it up and it went away.
01:24:28.100 Wow.
01:24:28.860 Yeah, man.
01:24:29.240 Um, do you, has the government tried to say that we need to make sure that like, could
01:24:34.120 that happen in our country where the government also curtails?
01:24:36.460 What's, um, it hasn't happened yet.
01:24:39.620 I obviously like, you know, you gotta, we have to, we have to make sure that we uphold all
01:24:43.960 our values.
01:24:44.420 Right.
01:24:44.640 And that we maintain free speech and we maintain free press and all these things.
01:24:48.420 But, um, as of right now, no, I don't think, I don't think that's a risk in the, in the
01:24:52.940 United States.
01:24:53.880 Awesome.
01:24:54.360 Thanks for this information.
01:24:55.640 Um, you hear about like chip makers, NVIDIA all the time, Taiwan, that place is just a
01:25:01.260 hotbed for, uh, chips.
01:25:03.880 Why is it a hotbed for chips?
01:25:06.360 Yeah.
01:25:06.620 So one of the, the biggest companies in the world is this company called, uh, Taiwan
01:25:12.460 Semiconductor.
01:25:13.740 Yeah.
01:25:13.960 TSM.
01:25:14.540 I've seen them.
01:25:15.200 TSMC.
01:25:15.720 Yeah.
01:25:16.000 So they're, they're, um, I mean, it's, it's like a trillion dollar company, uh, based in
01:25:20.660 Taiwan.
01:25:21.280 And it's, uh, that is where almost all of the high-end chips for AI that, you know, we're
01:25:27.600 kind of, we were kind of talking about, all of them are manufactured there.
01:25:30.920 They have these, they have the most advanced, think about them as factories, like the most
01:25:35.120 advanced chip factories in the world.
01:25:37.160 They're called fabs or fabricators, but basically these huge factories that are like, you know,
01:25:42.280 there, there's all sorts of crazy stuff.
01:25:43.980 So, um, they have the most expensive machines in the world.
01:25:46.760 They're machines that cost hundreds of millions of dollars in there.
01:25:49.360 They have, um, they build them because, uh, so that, you know, the chips, they have to
01:25:55.260 be made at, at the like finest levels and very, very precisely.
01:25:59.020 Yeah.
01:25:59.220 You need small hands.
01:26:01.060 Probably, huh?
01:26:02.040 You need, well that, and there's like these machines that, um, that, uh, that,
01:26:05.120 at the nanometer level make like little marks and etches on top of, uh.
01:26:09.980 And they have those?
01:26:10.780 They have those.
01:26:11.380 Yeah.
01:26:11.520 Those are super expensive machines.
01:26:13.080 So that, so, um, the, it's, it's crazy.
01:26:16.880 Yes.
01:26:17.160 But the, but it's so, it's the, the machinery is so precise that, um, even if there's like
01:26:23.520 a little bit of like seismic movement, a little earthquake or a little bit of movement, it
01:26:28.040 can fuck up the whole machine.
01:26:29.080 So they have to build the, the buildings, build the factories in a way such that there's
01:26:35.340 like, like the whole building doesn't move, even if there's like a little earthquake or
01:26:40.020 a little shake from the earth.
01:26:41.280 So it's like, it's this crazy, crazy engineering.
01:26:45.100 Um, and so that's, so these, all these giant factories are in Taiwan and that's where basically
01:26:51.320 like a hundred percent of all the advanced AI chips are made.
01:26:54.800 So that's why Taiwan matters so much.
01:26:56.860 Got it.
01:26:57.040 But then the, the reason it's a hotbed is that, um, uh, the, the People's Republic of
01:27:03.560 China, the PRC, uh, has a very.
01:27:06.700 They used to own Taiwan, right?
01:27:08.240 Yeah.
01:27:08.680 I mean.
01:27:08.940 Is that true or not?
01:27:09.440 I might make that up.
01:27:10.420 There's a, there's a complicated relationship between Taiwan and China where, you know,
01:27:15.660 if you ask people in Taiwan, they, they want to be independent.
01:27:18.140 They want to be their own country, but, um, but the People's Republic of China has a, um,
01:27:24.860 sort of a reunification plan that they want to bring Taiwan back into their country and
01:27:30.440 be back a part of China.
01:27:31.640 So it's kind of, you know, it's kind of like it potentially, you know, thankfully there's,
01:27:35.500 there's no war yet, but there's a risk.
01:27:38.400 Still talking to your ex.
01:27:40.020 Yeah, exactly.
01:27:40.800 There's a risk.
01:27:41.320 It becomes like Russia, Ukraine, or, you know, one of these really, really bad situations.
01:27:45.860 Um, so, so that's what's scary.
01:27:49.140 What's scary is that, um, that, that China, a China wants to, you know, either invade or
01:27:56.160 bring, bring Taiwan back into its country.
01:27:59.560 Um, and there've been, you know, um, president Xi has, has ordered his military to get ready
01:28:06.740 to do that before 2027.
01:28:08.320 Um, now we don't know what's going to happen, but, you know, if a extremely powerful world
01:28:14.800 leader says to get something ready by 2027, you kind of, you know, read between the lines
01:28:19.200 a little bit.
01:28:19.940 Um, and, um, and that's part of it is, is, uh, obviously it'd be, you know, we don't want
01:28:26.480 to enable them to, to take over this island.
01:28:29.220 But then the other thing that's scary is, um, China may view it as a way to just like
01:28:35.440 win on AI, because if they take over the island with all of these very, these giant factories,
01:28:41.700 all the chips, baby, they'll get all the chips, Frito Lamborghini, baby, they'd be running
01:28:45.500 it all.
01:28:46.240 They'd be running it all.
01:28:47.180 Yeah.
01:28:47.520 Wow.
01:28:48.320 So the, yeah, that's why Taiwan is, yeah.
01:28:49.980 Cause you kind of hear about it in the whispers of like a potential place where there could be
01:28:54.060 like a conflict.
01:28:55.060 Yeah.
01:28:55.640 And there's, there's all these reports about how, um, China's, they're, they're stacking
01:29:00.960 up tons and tons of military, um, right on their, their, their coast to, you know, that's
01:29:08.260 pointed directly at Taiwan.
01:29:09.900 And it's, it's pretty close.
01:29:11.960 Taiwan's pretty close to China.
01:29:13.080 Like it's, it's, uh, it's not so far away.
01:29:16.640 So.
01:29:17.280 Whew.
01:29:17.880 That'd be spooky.
01:29:18.840 That's spooky.
01:29:19.340 Yeah.
01:29:19.540 It's spooky.
01:29:20.180 We're so blessed to have a place where at least we can sleep in peace, even if we're
01:29:23.980 uncomfortable at times in our brains, you know, to not have that constant threat.
01:29:28.280 Yeah, totally.
01:29:29.380 Yeah.
01:29:29.580 So you don't think, you don't think that you don't worry that the government will regulate
01:29:31.840 right now.
01:29:32.780 It's not, it's not a concern at the moment.
01:29:34.840 Uh, regulate AI.
01:29:35.880 Yeah.
01:29:36.100 And America.
01:29:36.860 No, I don't think so.
01:29:37.800 Okay.
01:29:37.960 I think, I think, uh, I think we're focused on how do we make sure that America wins?
01:29:43.620 How do we make sure that, uh, that the United States comes out on top and that we enable
01:29:48.920 innovation to keep happening?
01:29:50.960 Would you think they could regulate the amount of chips that you're allowed to have?
01:29:53.980 So this is a hot topic globally, actually, which is, yeah, yeah.
01:29:58.320 Wow.
01:29:59.120 This is a super, this is actually.
01:30:01.560 Damn.
01:30:02.520 Finally, dude.
01:30:04.640 560 interviews.
01:30:05.800 We got a good question.
01:30:07.460 This is one of the hottest topics in DC right now, uh, is what, what are we going to do about
01:30:13.800 how many chips other people are allowed to have?
01:30:16.180 Because, because almost all the chips are American chips.
01:30:20.000 So they're all, they all are American chips.
01:30:23.680 And technically.
01:30:24.540 What do you mean we own most of them?
01:30:25.820 Yeah, exactly.
01:30:26.560 But China owns most of them too.
01:30:27.980 They're, China has their own, uh, has their own chip industry, but it's behind ours.
01:30:32.420 Okay.
01:30:32.720 Got it.
01:30:33.100 Yeah.
01:30:33.280 Yeah.
01:30:33.360 So, so the United States has the, has the most advanced chips as the, you know, these
01:30:37.800 chips are the envy of the world.
01:30:39.160 Everybody in the world wants our chips.
01:30:41.380 And, uh, the, one of the big questions is, are, you know, do we, does the government allow
01:30:48.160 a lot of these chips to go over overseas to China or parts of Asia or the Middle East or
01:30:53.660 wherever, or do we want to make sure they stay in America and make sure that we win in
01:30:58.920 America and this is a super duper, you know, they're called export controls.
01:31:04.660 Yeah.
01:31:05.140 Um, cause it's a possibility to run it all.
01:31:08.420 Yeah, exactly.
01:31:08.860 Who's got the chips.
01:31:10.020 What do you think about it?
01:31:11.940 Um, it's a complicated, complicated thing because basically, you know, one argument is,
01:31:18.980 um, we shouldn't be throwing our weight around in this way.
01:31:23.000 Like, you know, maybe it's, it's fine.
01:31:25.360 It's a free market.
01:31:26.120 Like if other people want our chips, they should be able to get our chips.
01:31:28.800 And that way, you know, the world is running on American chips that, that can be good in
01:31:34.560 some ways.
01:31:35.100 And it helps make sure that, you know, helps bolster our economy, our industry.
01:31:38.800 Um, but the other way to look at it is, Hey, AI is really, really important that America
01:31:44.240 wins at.
01:31:44.900 And we don't want to like, let's not give other people any advantages or let's make sure
01:31:50.860 that we win and then, and then we can figure out what we're going to do with all the chips.
01:31:54.420 So it, you can see both sides of it.
01:31:56.920 Right.
01:31:57.500 And there's like all sorts, you know, even beyond that, there's like 50 different arguments
01:32:01.720 on both sides of the, of the conversation.
01:32:04.540 But, um, you know, where I go, where I come from on it is like, let's, let's make sure
01:32:09.600 America wins and let's start from there and then figure out what we need to do to win.
01:32:15.860 Are there uses of AI that you feel like cross the line kind of?
01:32:19.360 Um, I definitely think like, uh, well, I, I worry a lot about this kind of like, uh,
01:32:29.220 you know, maybe brainwashing kind of thing.
01:32:31.440 Like, I don't want, I don't want AIs that are specifically programmed to make me think
01:32:38.880 a certain thing or persuade me to do a certain thing.
01:32:42.060 And that could happen.
01:32:42.700 That could happen.
01:32:43.500 Yeah.
01:32:43.760 So I'm really worried about this kind of like deception and persuasion from AIs.
01:32:49.280 Like I don't want AIs that are lying to me or, uh, that are sort of, that are like, that
01:32:55.640 are kind of like nudging me or, or persuading me to do things that I don't want to do, or
01:33:00.100 I shouldn't be doing.
01:33:01.060 That's what I worry about.
01:33:02.160 Because it could happen.
01:33:02.900 We don't realize how easily we're influenced little things that influence us.
01:33:06.340 Yeah.
01:33:06.800 And even just a turning of a phrase or a little bit of this, or pointing you to a couple
01:33:11.600 of lengths in your life could lead you down a whole world.
01:33:14.080 It's kind of, it's pretty fascinating.
01:33:16.360 So people that could, people that had, how do you keep your AI clean?
01:33:22.800 How do you guys keep your AI clean?
01:33:24.940 Well, this is where it goes back to a, the data.
01:33:27.220 Okay.
01:33:27.440 So you got to make sure that that data, to your point, the large body of water is as
01:33:31.980 clean as, as, as, um, as pristine as possible.
01:33:36.120 You got lifeguards on it.
01:33:37.100 Yeah.
01:33:37.340 You got lifeguards.
01:33:38.340 We got filters.
01:33:39.140 We got, we got game wardens.
01:33:41.440 Yeah.
01:33:42.320 Um, so, so the big part of it is about the data.
01:33:45.000 And then the second part is, I think we have to just, we have to constantly be testing the
01:33:49.020 AI system.
01:33:49.560 So, um, we have to, we have to like, we constantly are running tests on AI to see, Hey, is there,
01:33:56.880 are they unsafe in some way?
01:33:59.000 You know, one of the, one of the tests that we run a lot, uh, is, uh, and this is like,
01:34:04.320 you know, across the industry is like, um, are AI's helping people do really nefarious
01:34:10.840 things and we're making sure that they don't.
01:34:12.580 So, you know, if somebody asks an AI, Hey, help me make a bomb or help me make like a,
01:34:18.440 like COVID like 2.0 or whatnot, that the AI is not helping you do that.
01:34:23.860 So, um, so we run a lot of tests to make sure that it doesn't help in those areas.
01:34:28.320 And then, um, and then we make sure that the data is really clean so that there's no, there's
01:34:34.040 no sort of like little bit or piece of that that makes its way to the model.
01:34:38.540 With Outlier, that's your program.
01:34:40.400 Yep.
01:34:40.680 With Outlier, how are you, how are, what type of people are applying for those jobs?
01:34:45.360 Can people just log on and start to submit applications?
01:34:48.500 Like, how does that work to become a, um, information sourcer?
01:34:53.060 Yeah.
01:34:53.420 We call them contributors.
01:34:54.680 Okay.
01:34:54.940 Information contributor.
01:34:56.160 Everybody's kind of contributing to the AI's, everyone's contributing to the data, um,
01:35:00.100 that goes into the AI's.
01:35:01.400 Um, it's kind of like, I almost think of it as like the next generation of Wikipedia, right?
01:35:06.300 We're like, um, yeah, look at this.
01:35:08.560 And we're hiring people all around the world.
01:35:09.960 So people in all sorts of different languages.
01:35:12.580 Um, dude, that's crazy, man.
01:35:15.240 Yeah.
01:35:15.480 Well, it turns out, by the way, most of the AI's don't really, um, speak other languages
01:35:21.460 that well.
01:35:22.260 They're much, much better at, uh, at English and, and, uh, particularly American English
01:35:29.040 than, uh, than other languages.
01:35:32.220 And so we want to make sure that they speak all these languages well and that there's these
01:35:35.840 opportunities, but yeah, on Outlier.
01:35:37.600 So, um, anybody in the, around the world can log in and sort of, there's a little bit of
01:35:44.040 a, of a, of a, um, like orientation almost on, uh, on how to best, um, like what you're
01:35:51.540 supposed to do, how you're supposed to do it, what expertise should you be bringing, all
01:35:54.880 that kind of stuff.
01:35:55.420 And then, and then you can just start, um, contributing to the, to the AI models and you
01:35:59.360 get paid to do it.
01:36:00.360 Wow.
01:36:00.900 Yeah.
01:36:01.220 It's pretty fascinating, man.
01:36:02.820 And it's going to be, I mean, I really think, I legitimately think jobs from AI are going
01:36:08.080 to be the fastest growing jobs in the world for the years to come.
01:36:12.340 You do?
01:36:12.460 Yeah.
01:36:13.040 Like what, so jobs where people are able to contribute information jobs where people are
01:36:16.820 able to like, what, like what would examples of those be just some of the ones you've already
01:36:20.040 listed?
01:36:20.520 Yeah.
01:36:20.640 All the ones we've been talking about, right?
01:36:21.760 Like contributing to the AIs, um, helping to utilize the AIs and helping to, to shape
01:36:27.600 the AIs into, into applications or into, you know, uh, into in like helping organizations
01:36:34.280 or companies or people use the AIs.
01:36:36.800 That'll be a really fast growing job.
01:36:38.300 Um, helping to, to manage the AIs and make sure they're, they're, they're on the straight
01:36:43.420 and narrow.
01:36:44.460 Where would a young person go right now?
01:36:45.860 Who's some, who's getting to college or has, doesn't even want to go to college, but this
01:36:49.700 is the world they want to get into and be one of those people.
01:36:51.820 What would, do they do right now?
01:36:54.000 Yeah.
01:36:54.140 Well, this is all happening so fast, right?
01:36:55.980 Like the, uh, the, uh, like, uh, outlier, we only started a few years ago.
01:37:01.540 So all of this is happening so, so quickly, but what we want to do ultimately is make it
01:37:06.060 easy for anybody in the world to, you know, gain the skills they need to be able to do
01:37:12.220 this work well, to learn what it means, what it does, and ultimately be in a position where
01:37:16.360 they can, they can, you know, help build the AIs and then, and then keep improving that
01:37:21.160 and gaining mastery and getting better and better at it.
01:37:23.440 But like, where do they go to school?
01:37:24.580 Is there a school?
01:37:25.340 Is there a class that they should take online?
01:37:26.860 Like how does someone's start to become, you know, just get a head start on what could
01:37:33.240 potentially be probably a lot of job opportunities, I'm guessing.
01:37:36.020 Yeah.
01:37:36.520 Like in the AI space, right?
01:37:38.460 Yeah.
01:37:38.640 Like, is it just, is it just engineers?
01:37:40.640 Like, is it just mathematicians?
01:37:42.360 Like, no, it's everybody because yeah, as we were talking about, like AI needs to get
01:37:46.680 smarter about literally everything.
01:37:49.480 So, uh, but are there colleges offering court?
01:37:51.620 Like, is there, do you know, like, is there a specific places where people can, cause that's
01:37:55.220 another thing I think it's like, I'm going to work in AI and you're like, what do I do?
01:37:58.300 You know?
01:37:58.960 Yeah.
01:37:59.600 Yeah.
01:37:59.840 I, I don't think these programs exist yet.
01:38:01.640 I mean, we, we would definitely love to help build them.
01:38:04.660 Um, so I guess if any colleges are listening, you know, and you want to help figure out
01:38:09.100 about these programs, we love, we'd love to help.
01:38:11.160 That'd be pretty cool if you had your own kind of like course, not that you had to teach
01:38:13.980 it all that, you know, but you were like a partner of it somehow.
01:38:16.360 Yeah.
01:38:16.620 I mean, I think we'd love to, to basically teach everybody in America how to best contribute
01:38:21.860 to the AIs, how to, how to best basically take advantage of the fact that this is going
01:38:26.700 to be one of the fastest growing industries.
01:38:28.960 There's going to be tons of opportunities.
01:38:30.240 They're going to be shaped a little bit different from, you know, the jobs that exist today,
01:38:34.080 but you know, it's not going to be that hard for everybody to learn and figure out how to
01:38:38.400 participate in it.
01:38:39.740 What are some jobs that could, that could be at risk because of AI, right?
01:38:42.760 Cause you start thinking that like, yeah, before I was talking to you, there was this general
01:38:46.540 fear of like everything could be at risk.
01:38:48.160 Right.
01:38:48.940 Um, but when you think about it, you're like, yeah, these are some jobs that, I mean, they
01:38:52.440 won't disappear, but there might be less of them.
01:38:54.220 Right.
01:38:55.300 I just think it'll be, um, it'll be, we'll be doing like a different thing.
01:38:59.880 So.
01:39:00.720 Cause a lot of our fans are probably just blue collar listeners.
01:39:02.700 Like, like there's people that work in like, you're not going to, you're still going to
01:39:06.060 need a plumber.
01:39:06.660 He's still going to an electrician.
01:39:07.700 You're still going to need anything where you have to physically do something.
01:39:10.660 You're probably still going to need.
01:39:11.740 Yeah, for sure.
01:39:12.480 And then even stuff where you're like, let's say you're, you're mostly just working on a
01:39:16.600 laptop and you know, um, even for those jobs, like it'll just change.
01:39:21.440 Like instead of being, um, instead of my job being like, Hey, I have to do the work.
01:39:25.600 I have to literally do the work on a laptop.
01:39:27.480 It'll almost be like everybody gets, um, promoted to being a manager.
01:39:31.720 Like, because I'm going to be managing like a, a little pod of 10 AI agents that are doing
01:39:39.100 the work that I used to do, but I need to make sure that all of them are doing it right.
01:39:41.920 And that, um, they're not making any mistakes and that, you know, if they're making mistakes,
01:39:46.380 I'm helping them, you know, get around those mistakes.
01:39:48.420 Like, like, it's just going to, I, the way I think about it is that like, yeah, like
01:39:52.860 literally, um, over time, everybody will just be upgraded to being a manager or sort of promoted
01:39:59.540 to being a manager.
01:40:00.280 But can you have that many managers you think?
01:40:02.540 Yeah.
01:40:02.900 Cause I think that what's the other thing that's going to happen is the, um, the economy is
01:40:07.780 just going to grow so much.
01:40:08.980 Like there's going to be, there's going to be so much, like there will be like industries
01:40:14.480 are going to pop off in crazy, crazy ways.
01:40:16.940 And so, you know, the limit is going to be how many AIs can you have?
01:40:22.400 And then you're going to be limited for, in terms of the number of AIs you have by the
01:40:26.800 number of managers that you have.
01:40:28.140 So, um, it's gonna, uh, it's gonna, cause you need air traffic controllers.
01:40:33.560 You need as many of them as you can have.
01:40:36.480 Yeah.
01:40:36.600 Well that, that definitely, but, but you're right.
01:40:39.480 But I mean, in any field, you're going to need like just more managing, more people to
01:40:43.740 oversee and make sure that these, that different things are happening because some of the smaller
01:40:48.360 tasks will just be outsourced.
01:40:50.060 Yeah.
01:40:50.300 And just so much more stuff is going to be happening.
01:40:52.340 Right.
01:40:52.740 And that's kind of right.
01:40:53.920 Because yeah, once these things are all kind of taken care of, more things can happen at
01:40:57.820 this second level.
01:40:58.880 Yep.
01:40:59.400 That's a good point.
01:41:00.120 Yeah.
01:41:00.200 You don't think about that.
01:41:00.960 Once some of the things at the first level of, of, of, of certain businesses are handled
01:41:05.620 more easily by AI, then you're going to be able to have more people operating at a higher
01:41:11.840 level.
01:41:12.340 Yeah, totally.
01:41:14.160 It's kind of like, it's kind of like always the history of technology.
01:41:16.900 Like when, when we started developing technology that, that started, um, making farming a lot
01:41:23.580 more efficient, all of a sudden, um, you know, people could do a lot of other things other
01:41:29.400 than farming.
01:41:29.840 And then, you know, all of a sudden we have big entertainment industries and big financial
01:41:34.380 industries and, you know, barbecue cook-offs, man.
01:41:36.820 I'll tell you that second, some of those guys got the weekend off, they was grilling
01:41:40.380 shit that I knew.
01:41:42.400 But yeah, so it's all about like us.
01:41:44.400 Um, yeah, everybody leveling up to be managers and then also everybody, you know, just way
01:41:49.520 more ideas are going to start happening.
01:41:50.840 Like way more ideas are going to start becoming a reality.
01:41:53.560 And so, um, it'll be, I think it'll be pretty exciting.
01:41:57.500 Like, I think it's just like a lot more stuff is going to happen.
01:41:59.580 Yeah.
01:41:59.980 What, what, what companies do you see?
01:42:01.900 Like, are there companies where, um, you're the youngest billionaire in the world ever or
01:42:07.820 no?
01:42:08.380 Is that true?
01:42:08.920 Is that a weird statement?
01:42:10.020 We can take it out if it is.
01:42:11.360 I'm not trying to talk about your money.
01:42:13.160 I mean, you, you, but you are, is that true?
01:42:17.280 Uh, according to like some publications, but I don't know.
01:42:20.680 As a young entrepreneur, right.
01:42:22.380 And you've been very successful, you know, the, uh, self-made, um, billionaire, and we
01:42:28.780 can take the word billionaire out later.
01:42:29.920 If you decide you don't want it in, um, I just don't know how certain people feel about
01:42:33.140 that.
01:42:33.720 Um, and the founder of, of, of scale AI, where do you invest?
01:42:39.460 Like, are you investing your money in certain places, like certain fields that you see continuing
01:42:43.960 to do well?
01:42:45.580 Um, so I, most of, almost all of what I'm focused on is like, how do you invest in
01:42:50.660 how do we scale?
01:42:51.400 How do we make it super successful?
01:42:52.820 But, um, and make sure that we, you know, one of the things I'm, I think is really important
01:42:57.120 is like, how do we make sure we create as much opportunity through AI as possible?
01:43:01.580 Like, how do we make sure there's as much jobs?
01:43:03.040 How do we make sure that everything that we're talking about actually is what happens?
01:43:06.680 Because I think, no, someone's gonna have to really work to make sure all these jobs
01:43:10.560 show up and all this, all this stuff actually happens the way we're talking about.
01:43:13.560 So there's gonna be new industries that are going to pop up even.
01:43:16.020 Totally.
01:43:16.620 Yeah.
01:43:16.800 I mean, I think like just in the same way that, um, you know, it's hard, nobody could
01:43:22.680 have really predicted that podcasting was going to be this huge thing and this huge cultural
01:43:27.600 movement.
01:43:28.060 Yeah, it's true.
01:43:28.940 Um, but, but it is one and it's amazing.
01:43:31.320 That's like awesome.
01:43:32.240 And, uh, and that's going to happen in like little ways and all sorts of different industries.
01:43:36.900 Um, and that's going to be, it's going to be really exciting.
01:43:39.760 Uh, what are some of the things that excite you about technology right now?
01:43:42.940 Like what are, where do you see like AI and technology in five years, 10 years?
01:43:47.500 Um, yeah.
01:43:48.580 So, um, some of the areas I think are really, uh, are really exciting.
01:43:52.160 So one is definitely everything to do with healthcare and biology that that's moving really,
01:43:58.680 really fast.
01:43:59.220 And kind of, as we were talking about, like, I, I think legitimately in our lifetimes, we
01:44:04.860 could see cancer being cured.
01:44:06.660 We could see heart disease being cured.
01:44:08.360 Like we could see some really crazy leaps and advancements, um, in that area, which
01:44:14.400 is, which is super duper exciting.
01:44:16.100 Could it create a way that we could live forever?
01:44:18.020 Do you think?
01:44:19.340 Uh, I mean, it's just a, there's definitely people working on that.
01:44:22.260 Um, you know, uh, there's, so this is getting kind of crazy and very sci-fi, but, um, some
01:44:30.160 people think that there's, there's a way for us to keep rewinding the clocks on ourselves
01:44:36.920 so that we'll always feel young and like all of our cells will actually always stay
01:44:42.680 young.
01:44:43.560 Um, it's, I think scientifically possible, but, um, and, and I think if we can get there,
01:44:49.740 that's obviously incredible.
01:44:51.700 Um, so there's people working on that.
01:44:53.320 I think that's, uh, at, at the very least, I think we'll be able to lengthen, um, our,
01:44:58.860 our lifespans pretty dramatically and, and maybe we could get to that.
01:45:02.920 Wow.
01:45:03.380 Yeah.
01:45:03.860 Yeah.
01:45:04.220 Cause I always envision this, there's like a time where it's like, okay, this group lives
01:45:07.660 forever and this group doesn't.
01:45:08.840 And there's just that parting of two different way, you know, people head not in just into
01:45:13.580 the end zone of the Lord.
01:45:14.720 And then there's other people just loiter who are going to be loitering around for a long
01:45:18.080 time.
01:45:18.840 And what that would be like that cutoff, you know?
01:45:21.420 I, yeah, it's kind of, I mean, I, I'm not that worried about it, but it, it sucks to
01:45:26.000 be that cutoff where like, maybe, maybe, yeah, maybe, yeah.
01:45:30.500 That's a weird, it's a weird thought.
01:45:32.520 Um, cause it'd also be brave.
01:45:34.020 I mean, you'd be the astronaut.
01:45:35.200 I mean, dying, you're just an astronaut really into the ether.
01:45:38.980 You don't know what's going on.
01:45:40.360 Yeah.
01:45:40.520 You know, totally.
01:45:41.240 You'd have Lewis and Clark at a Lord at that point you were out there.
01:45:44.460 Yeah.
01:45:44.740 And then if you stay, you kind of are always going to be no, you always know kind of what's
01:45:48.800 going to happen in a way.
01:45:49.680 Cause you'll be here.
01:45:50.940 Yeah.
01:45:51.180 Which would be exciting, I think.
01:45:53.480 But then after a while you might be like, dang, what happens if you die?
01:45:56.380 That's the bravest choice you could make.
01:45:58.500 That's true.
01:45:59.140 Probably.
01:45:59.800 Yeah.
01:46:00.180 How did, do you see AI having any effect on religion?
01:46:04.300 Yeah.
01:46:04.640 I think, um, I think one of the things that, uh, something I believe is like, I, I think
01:46:12.120 that as AI becomes, um, uh, you know, this is one of the things I think is really important
01:46:19.700 is that we are, are able to educate people about AI and help people understand it better
01:46:24.700 and better and better.
01:46:25.980 Because, um, I think it's this scary thing that nobody understands or people feels like
01:46:31.720 a boogeyman or, you know, feels like is, is there's just this like thing that's going
01:46:36.660 to take my job?
01:46:37.400 Like that makes it scary.
01:46:40.160 And I think that that affects people's spirituality that affects how people, you know, contextualize
01:46:46.120 themselves with the world.
01:46:47.440 You could lose your purpose.
01:46:48.220 If your purpose is a job that you feel it's going to disappear.
01:46:51.080 Yeah.
01:46:51.480 That could already be causing you to feel that way.
01:46:55.920 Yeah.
01:46:56.300 So, but I think, I think if you, if we can explain AI more and ultimately like,
01:47:01.720 um, like it is, it is a cool technology, but it's not that magical.
01:47:05.980 It's just, you know, it's like data and you crunch that data and then you get these
01:47:09.940 algorithms.
01:47:10.340 And so it's not like, um, it's not, uh, yeah, some people talk about in this crazy
01:47:17.660 way, but, but I think as long as we are able to explain what AI is and also explain what
01:47:21.820 opportunities it creates over time.
01:47:23.660 And to me, it's about getting this like relationship between humanity and AI, right?
01:47:29.920 Like, how do we make sure that this is something that enables us as humans to be more human
01:47:36.460 and us as humans do more and experience more and be better and all these things versus something
01:47:41.840 that kind of, um, is scary or will take over or anything like that.
01:47:46.360 Uh, I think that's really important.
01:47:48.580 What's a place like, so if I go to chat GBT, is that scale AI?
01:47:52.060 Is that the same thing or it's different?
01:47:53.160 That's it.
01:47:54.060 So we, so we're actually kind of under the hood powering all the different models and
01:47:59.840 AIs.
01:48:00.640 So are all the different AI like systems under the hood?
01:48:05.140 Yeah.
01:48:05.360 Yeah.
01:48:05.480 So we help power chat GBT and open AI.
01:48:08.380 We help power, uh, mostly from the data perspective.
01:48:11.260 So do we know if the answer came from your company or other companies?
01:48:14.320 If we ask it a question, like how do we, uh, there's probably no way to like literally
01:48:19.180 tell, but yeah, we help power, you know, open AIs and we help power Google's AI systems
01:48:23.840 and Meta's AI systems and, um, help power all the, all the major AI systems.
01:48:29.860 Yeah.
01:48:30.000 How can a regular person just at their home, right?
01:48:32.200 Say there's a guy who's been listening to this today.
01:48:33.740 He wants to go home today and he just wants to learn a little bit of how AI works.
01:48:37.160 He could just go on to chat GBT and ask it a couple of questions.
01:48:40.140 Yeah.
01:48:40.460 You could ask chat GBT how AI works.
01:48:43.020 Um, you could ask it how, what's the history of my town?
01:48:45.060 You know, um, can you research my last name maybe and see where it came from?
01:48:49.240 Um, what are, uh, like maybe what are some, um, innovations that could happen in the next
01:48:57.740 few years?
01:48:58.180 There's different little things you can just ask it.
01:48:59.780 That's how you can start to have a relationship with asking and learning about, uh, and you
01:49:04.160 see what it's good at, what it's not good at.
01:49:05.920 Like right now, AI is still, uh, really bad at a lot of things.
01:49:09.440 Like most things.
01:49:10.460 This is why, this is why I think when people understand it and really get a feel for it,
01:49:15.540 it stops being as scary because, uh, because I think we think about it as like, we think
01:49:20.700 about it as like the AI from the movies that are sort of like, you know, all powerful and
01:49:23.980 whatnot.
01:49:24.400 But, uh, yeah, you think of it as a robot that's going to show up and just start driving your
01:49:27.900 truck around or something.
01:49:28.840 And you're like, well, what the fuck do I do?
01:49:30.680 You know what I'm saying?
01:49:32.100 Yeah.
01:49:32.380 My keys were in my truck.
01:49:33.400 You know, I can't even get in my house now.
01:49:34.760 Like, I think that's the, there is this boogeyman fear.
01:49:38.340 Yeah.
01:49:38.960 Yeah.
01:49:39.300 But that's not the truth.
01:49:40.540 It's not the truth.
01:49:41.360 And like, um, yeah, that's not the truth.
01:49:44.280 And, and it's, to me, it's like, we have, we kind of have the choice to make sure that
01:49:49.120 also doesn't become the truth.
01:49:50.700 Right.
01:49:50.980 Like we definitely, we in the, like people building AI, but just in general, everybody
01:49:55.540 in the world has like, like we should all make sure to use it as something that like
01:50:00.860 an assistant as something that like helps us versus, um, versus think about it in like
01:50:06.120 a, in like a scary way.
01:50:07.560 Well, getting to learn and learn how to use it small ways, whatever is certainly a way
01:50:12.800 that you're going to start to realize what it is.
01:50:14.600 Yeah.
01:50:15.020 And it's easy to just sit there and say it's horrible and without trying to use it to learn
01:50:18.980 about it.
01:50:19.540 Yeah.
01:50:19.860 Sometimes I won't learn about something just so I can continue to say it's a boogeyman,
01:50:22.900 you know, cause it kind of gives me an excuse.
01:50:24.400 Yeah.
01:50:24.820 You know, if I, if I choose not to learn about it in my own life, um, chat GBT has become like
01:50:30.360 a proprietary, like, like name, like band-aids or, um, ping pong, is that okay that it's
01:50:39.140 like that?
01:50:40.120 Yeah.
01:50:40.340 I think that's, uh, that's totally fine.
01:50:41.800 I mean, I think basically like we, there will probably be more AIs over time that people
01:50:47.780 get used to and use and, um, like anything, you know, there'll always be like a, uh, in
01:50:55.260 America, there'll always be a bunch of options for consumers and much options for people to
01:50:58.820 use and they'll be good at different things.
01:51:00.700 Right.
01:51:00.980 So I think like, um, right now we're just in the very early innings of AI, but over time
01:51:06.740 we're going to have, you know, just like how, um, uh, for, for anything like for, for clothes
01:51:13.900 or for, you know, energy drinks or for whatever, like different people have different tastes
01:51:19.200 because there's going to be different things that different AIs are good at and other things
01:51:23.180 that other AIs are bad at.
01:51:24.720 Is AI right now smarter than humanity?
01:51:27.840 No.
01:51:28.800 Um, yeah.
01:51:29.880 So I think what AI is good at because it's ingested all of the, the facts, right?
01:51:37.460 Like it's ingested this, like the body of water is really, really big and it's ingested so
01:51:43.000 many different facts and information from all of humanity.
01:51:47.520 It definitely like knows more, um, or like, you know, just like how Google knows a lot
01:51:52.940 more than, than any person does.
01:51:54.660 So it definitely knows a lot more.
01:51:57.200 Um, but it's not like, you know, there's, there's a very, very, there's tons of things
01:52:02.860 that humans can do that AI is just like fundamentally incapable of doing.
01:52:06.660 So, um, so it's not a, it's not like a, I don't think you can even like measure one
01:52:12.560 versus the other.
01:52:13.260 There's sort of like very different kinds of intelligence.
01:52:15.560 Could AI just create a better AI at a certain point?
01:52:19.020 Like, could it just be like, Hey, AI, create a better AI and it could do that?
01:52:23.840 Yeah, this is a good question.
01:52:25.080 Actually, this is, this is another really, uh, hot topic in the AI industry is, is can
01:52:32.200 you get AIs to start doing some of the engineering and some of the improvement on its own?
01:52:39.380 That's scary.
01:52:40.500 Cause then it's making kind of choices.
01:52:42.320 Then it's becoming the lifeguard.
01:52:44.400 It's becoming the water and the, uh, coast guard.
01:52:48.480 Yeah.
01:52:48.860 So this is something, I mean, my personal thought is, I think this is something we should kind
01:52:53.380 of watch a little bit and we should make sure that humans always are, have the sort of like
01:52:58.240 steering wheel and the, the sort of, uh, control over.
01:53:01.240 Um, because like you're saying, you know, it's like kind of a slippery slope before that gets
01:53:06.300 kind of, uh, you know, a little, a little weird.
01:53:09.080 But, um, but I, I don't, I think that like we can, we, we can maintain that.
01:53:13.240 We can make sure that we don't let the, the AI just sort of like keep iterating and improving
01:53:17.600 on its own.
01:53:18.080 Yeah.
01:53:18.760 And in the end, you can always shut off your computer and phone and go for a walk, huh?
01:53:22.880 Yes.
01:53:23.340 Yeah, totally.
01:53:24.360 It's not like it's going to come out and just, you know, slurp you off or something if you're
01:53:27.960 trying to be straight or whatever, you know, and it's a man.
01:53:30.720 I don't even know.
01:53:31.140 Is AI male or female?
01:53:32.900 I don't think it has a, I don't think it has a gender.
01:53:35.220 Wow.
01:53:35.480 I wonder if it'll decide one day.
01:53:37.980 Like, Hey, I'm Frank, you know?
01:53:39.960 Well, there's, there are companies that try to program the AI to adopt various personas
01:53:44.600 and, um.
01:53:46.200 Oh yeah.
01:53:46.660 I got the Indian GPS guy.
01:53:48.140 Turn right.
01:53:49.540 Like, uh, like on, on meta on like Instagram or whatever, you can get, uh, you can get an
01:53:55.200 AI that has Aquafina's voice, for example.
01:53:57.080 Oh, that's cool.
01:53:57.800 Yeah.
01:53:58.640 Um, and it's funny, you know, the Aquafina AI is funny.
01:54:01.820 Um, um, you're a self-made billionaire, which is pretty fascinating.
01:54:05.620 Congratulations.
01:54:06.360 I think, you know, to have, uh, um, I think that money is energy kind of, and it kind
01:54:11.580 of flows to certain places and stuff.
01:54:13.880 And, uh, congratulations.
01:54:15.920 That's gotta be fascinating.
01:54:17.400 Was that scary when that kind of happened?
01:54:20.180 Yeah.
01:54:20.580 You just made some money.
01:54:21.700 Was that kind of a scary thing?
01:54:22.840 Did you guys grow up really well?
01:54:24.020 Like, what was that like?
01:54:25.260 No, no, no.
01:54:25.860 I grew up, I think like solidly middle class.
01:54:28.660 Um, like we weren't, we, you know, we weren't, uh, we weren't like, uh.
01:54:34.280 You weren't trapping or anything.
01:54:35.560 You guys, yeah, yeah, yeah.
01:54:36.620 But, but like, you know, solidly middle class.
01:54:38.600 And I think like, uh, I mean, one of the, one of the, um, one of the crazy things about
01:54:44.900 tech and the tech industry is that, you know, we've been, we've been just sort of like building
01:54:50.360 this thing up over time.
01:54:51.480 But one of the things that, that happened is AI all of a sudden became like, there was
01:54:57.140 so much progress in AI and it became the biggest thing in the world.
01:55:00.500 Right.
01:55:00.880 Like, I mean, um, all of a sudden, you know, anywhere I go, I'm, everybody is talking about
01:55:07.460 AI.
01:55:07.860 It didn't used to be like that when I started the company.
01:55:09.700 Um, uh, AI was just kind of a, like a niche topic.
01:55:14.020 And now it's like, you know, anywhere you go, like I'll just be walking around and I hear
01:55:19.120 like random conversations about chat GPT and AI and robots and all this stuff.
01:55:23.820 And, uh, and so it's kind of been crazy to be experienced that and be a part of that
01:55:29.700 wave and kind of like, you know, I started working on this company almost nine years ago.
01:55:35.500 So it was like, when I started working on it, it was kind of this obscure thing.
01:55:39.240 And, you know, I always knew that it was going to become bigger, but I, I could have never
01:55:42.780 predicted what was going to happen to AI.
01:55:44.440 Yeah.
01:55:44.920 Yeah.
01:55:45.120 It's fascinating.
01:55:45.660 It's almost like you were just standing on like the bingo number.
01:55:49.260 They got called kind of by time.
01:55:52.160 Yeah.
01:55:52.220 Um, and it's surreal.
01:55:54.020 Oh, I can real when that happens.
01:55:55.680 Oh, yeah.
01:55:56.160 I can only imagine that.
01:55:57.640 Um, are your parents pretty proud of you?
01:56:00.180 What's that like?
01:56:01.980 Yeah.
01:56:02.400 So at first they were like, you know, at first I dropped out of college, right?
01:56:06.300 Oh yeah.
01:56:06.840 That's true.
01:56:07.200 Yeah.
01:56:07.600 And in Asian culture, that's like not a thing you do, right?
01:56:10.080 Not good.
01:56:10.360 Yeah.
01:56:10.720 Yeah.
01:56:10.900 That is like the opposite of being Asian.
01:56:12.820 Well, well, my parents both have PhDs.
01:56:15.560 Um, my two brother, I have two older brothers.
01:56:17.600 They both have PhDs.
01:56:18.960 Uh, like everybody in my family has, they've gone through all of schooling.
01:56:24.680 Yeah.
01:56:24.860 They're like, Alex is not doing good.
01:56:27.180 No, that might've been Spanish.
01:56:28.600 But they're like, yeah, Alex is not doing good.
01:56:30.540 Yeah.
01:56:30.960 So they were, they were pretty worried at first.
01:56:32.700 And I kind of, um, I told them a little bit of a white lie that I was like, oh no, I'm,
01:56:39.200 I'm just going, I'm going to go back.
01:56:40.660 You know, I'm going to finish, I'm going to get this, uh, like tech thing out of my
01:56:45.400 system and they'll go back and I'll finish school.
01:56:48.140 Um, obviously that hasn't happened yet, but, uh, but yeah, they were worried at first.
01:56:52.260 Now that now they're super proud.
01:56:53.600 Yeah.
01:56:54.000 They're super duper proud.
01:56:55.060 And I, uh, yeah.
01:56:56.720 And I, I owe everything to my parents, you know, they're, uh, they're awesome.
01:56:59.620 And like, seriously, they're, they're, my parents are super brainy.
01:57:03.780 They're, uh, they're both physicists.
01:57:05.680 They would like teach me about physics growing up and teach me about math.
01:57:08.900 And, uh, that's what led me, uh, be so good at the competitions, you know?
01:57:13.400 That's cool.
01:57:14.700 Yeah.
01:57:15.060 I don't even, I think do, um, are your parents or your grandparents from China or no?
01:57:21.000 My grandparents are from China.
01:57:22.180 Yeah.
01:57:22.680 Did your, does your family have a lot of Chinese culture?
01:57:25.720 So this is kind of interesting.
01:57:27.180 This is true for a lot of, um, Chinese Americans is that there's kind of like, there's, uh,
01:57:34.220 Chinese culture.
01:57:35.440 And then that's kind of almost like, it's very different from the Chinese communist party
01:57:41.660 and the current government.
01:57:43.180 Because basically one way to think about China is, I mean, China has been, is a, is a culture
01:57:48.240 and a civilization that's been around for like thousands and thousands of years.
01:57:52.400 Right.
01:57:52.820 So it's, there's a very longstanding culture.
01:57:56.000 Oh yeah.
01:57:57.000 But that's very different from the current communist party and the current communist regime.
01:58:03.160 Oh, for sure.
01:58:03.640 I think most people probably think of, I mean, I don't know.
01:58:06.680 It's funny.
01:58:07.180 I never thought about that.
01:58:07.900 I definitely think about him as different.
01:58:09.180 I definitely don't think if I see a Chinese person, I don't think, oh, that's a communist
01:58:12.040 person.
01:58:12.600 Yeah, exactly.
01:58:13.180 I mean, that would be, yeah, that's crazy.
01:58:14.700 I think that's crazy.
01:58:15.520 If some people thought that I just, yeah.
01:58:18.680 I think somebody has a crazy long history, like, damn.
01:58:22.460 And then maybe almost a, do you feel like there's some people in certain parts of China
01:58:25.920 are almost like a captured people then?
01:58:27.580 Is it, do you think that's a?
01:58:29.120 Yeah.
01:58:29.440 I mean, the, the Uyghurs that we're talking about, I mean, that's just like horrible what's
01:58:33.040 happening to them.
01:58:33.640 I didn't even know about that, man.
01:58:34.720 Thanks for putting me up on that Uyghur game.
01:58:36.580 Yeah.
01:58:37.120 Yeah.
01:58:37.320 No, it's, uh, I mean, some of the stuff, like, I mean, we don't really like, first
01:58:41.420 of all, the world is a, is a huge place and there's like all sorts of both great and bad
01:58:48.140 things happening all over the place.
01:58:49.700 Yeah.
01:58:50.100 But, um, but yeah, no, I think that like, I think a big part of this is like, we want
01:58:54.600 to make sure we have, we have governments that, uh, that believe in democracy, believe
01:58:59.180 in liberty, these kinds of things.
01:59:00.660 Yeah.
01:59:02.240 With being somebody that's able to be smart and conceptualize stuff, do you start to get
01:59:07.140 in, do you have any insight?
01:59:08.460 This might be the last question I have for you.
01:59:10.100 Um, do you have any insight on like the afterlife or what happens?
01:59:14.800 Like I never really thought about it for like talking to like a real math guy about that.
01:59:21.360 Yeah.
01:59:21.740 I think, um, like what's the total, like what's the sum zero game or whatever, you know, what's
01:59:25.980 the, yeah, yeah.
01:59:27.520 Yeah.
01:59:27.880 I guess like the way I've always thought about it because, um, partially because both my parents
01:59:33.480 were physicists was I, I kind of always feel like people live on with their ideas and they're
01:59:41.220 kind of like, like what are the things that, the things they put out into the world.
01:59:45.000 That's kind of how you live on.
01:59:46.740 Cause like in, in math, like everything you learn about is like a theorem named after a
01:59:52.380 different person or, uh, or sort of idea named after some, you know, the first mathematician
01:59:58.660 or the first scientist or whatever to figure that out.
02:00:02.400 Um, like Einstein lives on because bagels, that was a shitty joke, but thank you.
02:00:08.740 Um, oh yeah.
02:00:10.060 Because, you know, because the, yeah, all E equals MC squared, all that kind of stuff.
02:00:14.960 Yeah.
02:00:15.480 So, so like, I know what that kind of stuff is.
02:00:17.860 Jesus.
02:00:18.280 Sorry, but it's, sorry, pretending.
02:00:20.040 No, no, no, exactly.
02:00:21.280 E equals MC squared.
02:00:22.000 That's Einstein.
02:00:22.600 And so, so I think I always feel like people, um, yeah, you live on by your ideas and, and
02:00:29.660 kind of what you put out into the world.
02:00:31.360 Um, and that's kind of always how I've thought about it.
02:00:33.800 Um, so you don't get some deeper thought about like, like since your brain is able to
02:00:38.140 be, cause you have probably a unique brain, right?
02:00:40.760 And more, I mean, you know, and everybody has a unique brain, but I've just never asked
02:00:45.980 somebody with your, I've never asked your brain this question.
02:00:48.660 Yeah.
02:00:49.240 Do you, do you get some further insight about like what you think happens when you die?
02:00:54.300 You know?
02:00:55.020 Well, I think that one of the things that gets talked about a lot in Silicon Valley, um, where
02:01:00.480 I live, especially is, uh, like the simulation, whether it's all simulation and whether like,
02:01:06.600 uh, I don't, do you watch Rick and Morty?
02:01:08.900 I don't watch it, but I'll start.
02:01:10.800 There's a, there's, there's some, there's some episodes where it gets into this and I
02:01:14.920 think it, it covers it in a, in a pretty good way.
02:01:17.000 But like, you know, what if every, what if all of humanity is just like a, like a, almost
02:01:23.020 like a, an experiment or a video game that, that some other civilization is running?
02:01:27.960 Yeah.
02:01:28.500 That's kind of the one that, uh, that fucks with particularly like people's mind and tech
02:01:33.700 a lot because we're like every day, all day, every day, we're out there trying to make
02:01:38.860 computers and make simulations and make things that are, that are like more and more sophisticated
02:01:44.580 and advanced and capable.
02:01:46.140 And so kind of the mind fuck is like, oh, what if everything we know is, is just kind
02:01:51.620 of the, you know, a simulation from some other civilization.
02:01:55.540 Or if we advanced it enough that we're able to make this happen and seem real.
02:01:59.440 Yeah, exactly.
02:02:00.380 Yeah, exactly.
02:02:01.200 So, you know, I think, I think one of the things that, that, um, like with AI and with a bunch
02:02:06.920 of other things, like in, well, even just in the past, like 30, 40 years, video games
02:02:12.320 have gotten so much more realistic, right?
02:02:15.380 Crazy realistic.
02:02:16.680 Yeah.
02:02:17.300 So it was, we've seen that happen like in a hundred years, like, would we be able to
02:02:23.340 simulate something that feels pretty realistic?
02:02:26.260 I mean, probably.
02:02:27.520 Right.
02:02:27.720 Could we be avatars?
02:02:29.060 Yeah.
02:02:29.620 For some, for some other thing.
02:02:31.700 Yeah, exactly.
02:02:33.020 Damn, dude.
02:02:34.340 I'll say this.
02:02:34.900 My avatar is a pervert, brother.
02:02:38.020 Um, have you met any guys like, uh, Jinssen Hong?
02:02:42.100 Is that his name?
02:02:42.760 Yeah, yeah.
02:02:43.380 You have?
02:02:44.020 Yeah, totally.
02:02:44.760 Wow.
02:02:45.120 What is it like when you meet some of these guys?
02:02:46.380 Have you met Elon yet?
02:02:47.560 Met Elon?
02:02:48.160 Yeah.
02:02:48.560 Met Elon a few years ago.
02:02:49.680 Yeah.
02:02:50.200 Wow.
02:02:50.540 Got to know him.
02:02:51.340 He's the fucking, he's the, he's turning into him.
02:02:53.120 Jinssen's the goat.
02:02:54.140 Yeah.
02:02:54.380 He is?
02:02:55.440 He is.
02:02:55.820 Yeah.
02:02:56.000 He always wears this leather.
02:02:57.680 You see that leather jacket?
02:02:58.740 Yeah.
02:02:59.420 He, uh, we hosted this dinner.
02:03:01.860 And Hong, is that a Chinese name?
02:03:03.140 Hong?
02:03:03.480 Hong.
02:03:03.800 Yeah.
02:03:04.100 He's Taiwanese, I think.
02:03:05.400 Yeah.
02:03:06.080 Um, or Taiwanese American.
02:03:08.000 But, uh, he, uh, he in, in 2018.
02:03:12.620 So when we were like a baby company, uh, we threw this dinner in, uh, in Silicon Valley.
02:03:19.500 And, uh, and we just kind of, I kind of YOLO invited him, um, this was, you know, years and
02:03:26.480 years ago.
02:03:27.300 And, uh, and I didn't expect it, but he said yes.
02:03:29.560 And he came to our dinner and, uh, he, he came, uh, he came, it was, it was like at this
02:03:35.760 restaurant in Silicon Valley.
02:03:37.120 And, uh, he came and, uh, he came and, uh, he, uh, he, uh, he, uh, he went to boarding school.
02:03:45.240 I think he's probably told this story, but like he went to boarding school, um, and his
02:03:49.880 parents, when they came to America, they, they, they wanted to send him to boarding school,
02:03:53.600 but they didn't.
02:03:55.440 Um, and so they just sent him to like the first boarding school that they, they like
02:03:58.540 found on Google or something or like found, it wasn't even Google at the time.
02:04:01.800 So that, that they like heard about, and that boarding school happened to be, um, kind of
02:04:06.700 like a rehab boarding school.
02:04:08.440 So it was like, he was like this.
02:04:12.300 So fucking, it's a halfway house.
02:04:14.120 He's just sitting there.
02:04:15.380 He's there learning with people who are, um, detoxing.
02:04:18.900 Yeah.
02:04:19.060 So he, he told me the story about, he told the story about how he was like, he was just
02:04:22.760 like this, this kid, um, at, you know, this boarding school that were like, everybody else
02:04:28.520 had these, like all these crazy backgrounds.
02:04:30.280 And he like, he got by and made, made his way through that school by like, uh, by like
02:04:37.320 doing everyone's math homework and like, you know, kind of like wheeling and dealing that
02:04:41.040 way.
02:04:41.440 Oh yeah.
02:04:42.040 And you could see that he, he learned how to like wheel and deal and sell and all this
02:04:47.360 stuff from all the way from way back then.
02:04:50.160 Cause he was, he was, I mean, his, his, his, his story is pretty crazy.
02:04:53.840 Really?
02:04:54.180 But, uh, yeah.
02:04:54.700 We have to research him a little bit.
02:04:55.800 Maybe we would get to talk with him one day.
02:04:57.480 Yeah.
02:04:58.120 Jensen, Jensen's awesome.
02:05:00.060 Um, you know, all these people in tech, they're like, uh, I mean, they're all real people,
02:05:03.920 but they all have the craziest backgrounds.
02:05:06.080 Yeah.
02:05:06.380 Yeah, man.
02:05:06.700 It's just so funny.
02:05:07.360 Whenever I met you, I just didn't know.
02:05:08.960 I figured that since you were with Sam Altman, that you were probably a tech guy, you know?
02:05:13.760 And, um, but yeah, I didn't know.
02:05:17.980 Uh, I think maybe somebody said he's in the AI verse, you know, but you just seem like
02:05:21.480 such a totally normal, like I would not have thought that, um, you were just, I don't know.
02:05:26.960 I guess sometimes you think like somebody's going to be, they're going to be like super
02:05:30.160 quiet or, you know, not have a lot of different thoughts, but yeah, it was cool, man.
02:05:35.140 We had a good time.
02:05:35.720 I'm glad we got to link up.
02:05:37.920 Yeah, totally.
02:05:38.900 That was cool, bro.
02:05:39.760 You're probably like my, you might even be my first Chinese friend.
02:05:42.080 I don't think second, probably Bobby Lee, who's denying it, but he'll come around.
02:05:49.760 Um, Alexander Wang, man.
02:05:51.320 Thank you so much, dude.
02:05:52.620 I bet your, uh, whole family's super proud of you.
02:05:55.080 Um, yeah, it's exciting, man.
02:05:56.320 Thank you for coming to spend the time with us and just helping us, uh, learn and think.
02:06:00.820 No, thank you.
02:06:01.740 This was awesome.
02:06:02.360 And I think like, I mean, we were talking about this before, but, um, I don't want to
02:06:06.140 make sure that like people all around the world, especially Americans aren't scared of
02:06:10.680 AI because it's going to be, it's going to be really cool and it's going to be amazing,
02:06:14.440 but, um, we need to remove the boogeyman component and, uh, and thanks for helping me do that.
02:06:20.880 Yeah.
02:06:21.300 No, man.
02:06:21.640 I think I definitely feel differently about it.
02:06:23.740 I feel like it's a tool that I can use.
02:06:25.580 Right.
02:06:25.920 And even, I don't know how to use it.
02:06:27.680 So I'm trying to figure out, you know, um, one more question.
02:06:30.820 How do you keep it from becoming like a, like a advertisement trap house?
02:06:34.480 Like the internet's become like the internet's just pop-ups and ads and fucking best buy trying
02:06:39.060 to like beat you over the head on some shit.
02:06:40.680 Like, how do you stop that?
02:06:42.780 How do you keep that out of like you guys as waters or do you have to go there at some
02:06:47.420 point to make money?
02:06:48.280 Yeah.
02:06:49.440 First, I'm hoping that, that, um, we have like the AI industry as a whole avoids advertising
02:06:55.580 as much as possible because, um, because it's kind of, it's very different.
02:07:00.260 Like it is a tool that, that people can use and people can use to start businesses or make
02:07:05.900 movies or make all these like different ideas happen.
02:07:08.360 And I, I would much rather it be a tool that doesn't get sort of, um, uh, that doesn't become,
02:07:14.900 yeah, like an advertising thing versus like, I wanted to make sure it's a tool that helps
02:07:19.620 people first and foremost.
02:07:21.320 Um, so that's that, I, I think this is, there's kind of like a, a choice here and, uh, and we
02:07:27.640 as an AI industry just got to, I think, make some of the right choices.
02:07:30.540 Yeah.
02:07:30.960 I think there would be value in staying as pure as you could, if you could find a way to,
02:07:34.580 you know, if there's other money to be made on the side, it almost seems sometimes like
02:07:38.000 it could be a trap, you know?
02:07:39.440 Yeah.
02:07:40.140 Yeah.
02:07:40.380 And I think that's, that's like, uh, you know, I, I want to make sure that, uh, that
02:07:46.180 people don't feel like they're being used by the AIs.
02:07:48.180 I think that'd be a really, that'd be really bad if we ended up there.
02:07:50.980 So we, you know, and I, that, I don't think we need to make it like that at all.
02:07:54.940 Like, I think we can, we can make sure the AI is helping you do things is, is super helpful.
02:07:59.680 It's super, um, is like a thought partner.
02:08:02.520 Um, is it, is like an, like an assistant.
02:08:04.760 Like those are things that I think we want to make sure AI stays.
02:08:07.580 Gang, baby.
02:08:08.660 Wang gang.
02:08:09.500 Gang, Alexander Wang, man.
02:08:10.780 Thanks so much, bro.
02:08:12.000 Yeah.
02:08:12.240 Thanks so much for having me.
02:08:13.200 Yeah.
02:08:13.360 It was awesome, man.
02:08:14.360 Danny Kane.
02:08:15.040 Shout out to Danny, um, who came up and Danny lives in Franklin, right?
02:08:19.180 We got to get to spend some time with him.
02:08:20.520 Yeah.
02:08:20.780 Lives in Franklin.
02:08:21.640 Whenever you're out there.
02:08:22.540 And shout out to, uh, Alex Brusowicz, who, uh, who we met through.
02:08:26.640 Yeah.
02:08:27.120 Who else on your team's here, man, today?
02:08:29.120 Uh, we have Joe Osborne.
02:08:30.960 Yeah.
02:08:31.740 And, uh, and we have Danny's whole crew.
02:08:34.740 So Arian and, uh, Clayton.
02:08:37.800 Yeah.
02:08:38.120 Nice.
02:08:38.460 We'll have to get a group picture.
02:08:39.780 We'll put it up at the end.
02:08:40.720 Um, thank you guys.
02:08:41.840 Uh, have a good one, man.
02:08:42.920 Now I'm just floating on the breeze.
02:08:45.100 And I feel I'm falling like these leaves.
02:08:48.080 I must be cornerstone.
02:08:53.520 Oh, but when I reach that ground, I'll share this peace of mind.
02:08:58.040 I found I can feel it in my bones.
02:09:02.200 But it's gonna take...
02:09:04.580 But it's gonna take...
02:09:05.380 Oh, but there is a way...
02:09:07.280 Yeah.
02:09:12.000 Oh.
02:09:14.760 Yeah.
02:09:16.180 Oh, no.
02:09:16.560 Yeah.
02:09:16.840 What?
02:09:17.800 But what if I can come to mimics for such a great söndrom,
02:09:21.780 you can,
02:09:23.240 and say,
02:09:25.760 Oh, so.
02:09:26.220 revelry by.