This Past Weekend with Theo Von - July 23, 2025


#599 - Sam Altman


Episode Stats

Length

1 hour and 33 minutes

Words per Minute

204.4

Word Count

19,130

Sentence Count

1,441

Misogynist Sentences

5

Hate Speech Sentences

10


Summary

Sam Altman is a straight-up tech lord. He's one of the leaders in the development of artificial intelligence and founded OpenAI, a company best known for developing the chatbot ChatGPT. In this episode, we chat about the pros and cons of AI, the fears and hopes, and where we're headed.


Transcript

00:00:00.300 Today's guest is, well, dude's a straight-up tech lord, let's be honest.
00:00:05.000 He's one of the leaders, the world leaders, in the development of AI.
00:00:11.100 He started OpenAI, which is known for having ChatGPT.
00:00:17.700 We had a fascinating chat about the pros and cons, the fears and hopes,
00:00:22.920 everything I could learn about artificial intelligence and where we're headed.
00:00:28.280 TBD, baby.
00:00:29.140 Today's guest is Mr. Sam Altman, and I'm very thankful for his time.
00:00:43.140 You know, we had a residential architect do this office.
00:00:53.260 We wanted it to feel like someone's, like, really comfortable, like, country house or something like that.
00:00:57.320 Yeah.
00:00:57.800 Not, like, the big corporate, like, sci-fi castle.
00:01:00.720 Yeah, that's what I was, I was, like, a little bit like, oh, is it going to be, you know, will there be a drawbridge?
00:01:05.220 Will we be uploaded into a suite?
00:01:07.580 Like, what will happen to us?
00:01:08.740 Yeah, we don't want that.
00:01:09.500 Yeah.
00:01:09.640 We're going for, like, residential.
00:01:11.080 Yeah.
00:01:11.360 I was, like, how do we even get through the firewall?
00:01:13.460 How many, like, hit points will we need to get through?
00:01:15.580 You know, it got very Dungeons and Dragons in some of my, like, imagination sometimes.
00:01:21.720 Yeah, we want people to feel, like, super comfortable and try to get pretty far in that direction.
00:01:25.700 It feels like it.
00:01:26.660 Your staff's very sweet, nice people.
00:01:30.020 You have, thanks for hanging out, man.
00:01:31.900 Absolutely.
00:01:32.400 Thanks for having me to do this.
00:01:32.640 I definitely appreciate it.
00:01:34.620 Yeah, I haven't seen you since I fell out of my chair.
00:01:36.160 You fell out of the chair at the inauguration.
00:01:37.820 That was really, like, quite a way to meet you.
00:01:39.620 Yeah.
00:01:40.060 I felt so embarrassed.
00:01:41.200 And you were one of the faces that I looked up and saw, and I was like, God.
00:01:44.000 And that was my first moment, like, AI built us a better chair, to be honest with you.
00:01:48.100 And you did nothing, right?
00:01:49.260 You were just sitting there, and it just collapsed.
00:01:51.340 Nothing.
00:01:51.740 I remember that.
00:01:52.800 And it was just so embarrassing.
00:01:54.120 I was like, oh, of all people, me, and here I am in this place.
00:01:57.680 I think it was perfect because everybody's got to have some story.
00:02:00.460 And people are like, oh, it was the inauguration.
00:02:01.600 And, like, everybody's got to have some story to tell.
00:02:03.540 Yeah.
00:02:04.140 And that was an incredible story for us all to tell.
00:02:06.140 That's a good point.
00:02:07.280 I do remember looking at people for help, though.
00:02:09.720 And oddly, your eyes, I was like, oh, my God.
00:02:12.600 He could help.
00:02:13.420 You did look like a beacon of help in the distance.
00:02:15.040 I tried to help.
00:02:17.260 You have a baby.
00:02:18.140 You have a new child?
00:02:20.460 It is.
00:02:24.200 There have been, like, a lot of experiences in life where everyone tells you something's going to be great.
00:02:27.960 And then it's like, okay, the people are right.
00:02:29.920 The consensus is right.
00:02:30.800 It's, like, even better than I thought it was going to be.
00:02:32.980 But this has been the strongest example of that ever.
00:02:35.480 Like, I knew it was going to be great.
00:02:37.240 And it's, like, way, way better.
00:02:38.700 It's impossible to describe.
00:02:39.900 There's nothing I can say that's not, like, very cliche.
00:02:42.520 And it's totally amazing.
00:02:44.260 What is, like, one of your – and it's a – you have a young boy?
00:02:46.940 Yeah.
00:02:47.060 And what's something, like, that you think is, like, neat or, like, what's one thing that kind of, like, is bringing you joy with it?
00:02:55.480 Watching the speed with which he, like, learns new things or gains new capabilities is just unbelievable.
00:03:03.120 It's, like, every day.
00:03:03.980 It's, like, oh, man.
00:03:05.160 He just couldn't do that before.
00:03:06.200 And now he's, like, grabbing stuff and passing it between his hands.
00:03:08.280 And getting to, like, watch it day to day is just an amazing rate of change.
00:03:14.920 And then I don't, like – again, I realize it's, like – you know, I realize that, like, everything about babies are very finely tuned over a long period of evolution to make us, like, love them and be fascinated by them.
00:03:27.600 And it's, like, a neurochemical hack.
00:03:29.200 But I love it.
00:03:29.960 It's great.
00:03:30.460 It's so strong.
00:03:31.160 It's so intense.
00:03:32.340 So it's really, like, almost like a coffee for your heart or something kind of?
00:03:35.020 I don't even know how to find – I've tried to, like, come up with an analogy to tell – because now I'm, like, telling everybody, you've got to have a lot of kids.
00:03:41.780 It's really important.
00:03:42.460 Yeah.
00:03:43.420 And I've been looking for an analogy of what to explain.
00:03:45.920 And then I always just say, like, I don't know how to explain this.
00:03:48.440 It's just – it is the best thing I've ever done by far.
00:03:51.020 I feel like a completely changed person.
00:03:54.940 And I was, like, thinking the other day, like, there used to be all these other – like, at this point, all I do is work and hang out with my family.
00:04:02.140 I was, like, I don't – I don't, like, really get to do a lot of hobbies anymore.
00:04:06.020 Right.
00:04:06.280 It's a busy time at work.
00:04:07.560 I don't get to hang out with my friends that much.
00:04:10.940 And I don't – you know, there were, like, all these things where people tell you, like, oh, you've got a baby coming.
00:04:14.620 You've got to go, you know, take that spontaneous international trip because you're not going to be doing that again for a long time.
00:04:19.160 And I was, like, oh, that is kind of sad.
00:04:22.160 In practice, you don't do it that often.
00:04:23.920 I at least didn't do it that often.
00:04:25.060 And I don't miss it at all.
00:04:26.340 I, like, remember that that used to be a possibility.
00:04:28.820 Now I can see that's not going to be a possibility for a long time.
00:04:30.840 And I'm thrilled with the trade.
00:04:32.140 You're moved on.
00:04:32.960 I'm so happy.
00:04:34.180 How old is your child?
00:04:35.560 Four months.
00:04:36.440 Oh, that's a funny – like, at five or six months, they start to get, like, fun and you can, like – they're still, like, they can't go anywhere, you know?
00:04:42.940 But they're, like, intrigued and stuff.
00:04:45.300 They start to, like, smile or process more.
00:04:47.500 I don't know how you guys say it.
00:04:48.880 But –
00:04:49.200 Yeah, he's totally, like, turned on now.
00:04:50.880 Yeah.
00:04:51.060 Like, really aware, understands things.
00:04:52.400 It's super cool.
00:04:53.320 I have a thought sometimes that this will be one of the last, like, maybe 40 years that we conceive children in the body.
00:05:00.760 Did you have any thoughts about that?
00:05:02.200 I've definitely heard a lot of people say that.
00:05:06.120 Like, I haven't thought about it hard myself.
00:05:10.960 But, yeah, I guess it does make sense.
00:05:12.780 Like, I guess that does make sense.
00:05:15.940 Like, God, you were in your mom's – that's crazy, you know?
00:05:18.460 It is crazy.
00:05:18.940 You pervert or whatever.
00:05:20.120 Like, I think in the future, people will be – it will be kind of done, like, in a –
00:05:23.600 In a vat or something.
00:05:24.440 Yes.
00:05:24.940 In, like, a nice vat.
00:05:25.960 You can go see it on the weekends or whatever and, like –
00:05:27.780 Doesn't that just feel, like, off to you?
00:05:30.380 Like, I can totally intellectually, like, understand that that may be the better way to do it.
00:05:34.700 Oh, yeah, it feels way off to me.
00:05:35.860 I was trying to – I thought you would like it.
00:05:38.640 You know?
00:05:39.080 I thought –
00:05:40.020 I mean –
00:05:40.820 Like, or I thought that would be, like, a thought.
00:05:42.780 Like, I guess, like, that – for me, that's one of, like, my futuristic thoughts, you know?
00:05:46.520 Like, I can totally accept that that will be what everybody does and that it's, you know, easier and we can, like, make it healthier for the child and the mother.
00:05:56.000 Yes.
00:05:56.020 You know, the mother doesn't take the health risk.
00:05:57.680 But, man.
00:05:59.940 So, intellectually, I can say that and then, like, emotionally, it feels like something is off to me.
00:06:05.620 It feels off.
00:06:06.100 Yeah.
00:06:06.680 Oh, yeah.
00:06:07.720 Yeah, because then the family, like, on the weekends, the parents would come and, like, tinker on the glass or whatever.
00:06:12.380 Or their dad would put, like, a, you know, like a Go Falcon sticker on the thing.
00:06:17.220 You know what I'm saying?
00:06:17.820 People would, like, decorate it all up or write little messages on there.
00:06:22.280 You know, I think there's another – like, another take I have on all of this is that in this world that we're heading to of, like, crazy sci-fi technology becoming reality,
00:06:30.920 the sort of, like, the deeply human things will become the most precious, sacred, valued things.
00:06:36.040 And that we'll really care about, like, the human experience more than ever.
00:06:40.800 And maybe it won't go that way.
00:06:41.700 I don't know.
00:06:42.600 Yeah.
00:06:43.720 Do you – no, and that's some of the stuff we want to talk about.
00:06:46.220 And thanks so much, man, for sitting down.
00:06:48.900 Do you think your child will go to college?
00:06:51.440 Do you think, like – what do you kind of think that looks like?
00:06:54.220 Probably not, if I had to guess.
00:06:56.480 Like, I think – well, I only went to half of college.
00:06:59.040 Did you drop out?
00:07:00.300 Yeah.
00:07:00.960 Dude, you guys all – I freaking dropped out.
00:07:02.920 I didn't get shit.
00:07:04.520 You dropped out.
00:07:05.500 Wang dropped out.
00:07:06.820 Zuckerberg dropped out.
00:07:08.860 Probably a lot of other people.
00:07:09.720 And you.
00:07:10.560 Yeah.
00:07:11.200 Okay.
00:07:11.520 Yeah.
00:07:11.920 Well, hey, we're both here, so.
00:07:13.680 Oh, it worked out fine.
00:07:14.660 Yeah, you're right.
00:07:15.220 You know?
00:07:15.720 You're right.
00:07:16.100 Never mind.
00:07:16.540 I'm sorry.
00:07:16.820 I'm being self-defeating.
00:07:19.560 Yeah, what does that look like when you think about that?
00:07:21.500 Like, yeah, with AI, with so much new information coming online, right, and so much, like, data being collected
00:07:26.880 and, like, information being carpooled, maybe, if that's even a term.
00:07:33.120 So, you and I never grew up in a world that didn't have computers.
00:07:36.880 Right.
00:07:37.060 Like, and our parents were like, oh, this – there weren't computers, and then there
00:07:40.920 were, and it was this big, crazy adjustment.
00:07:42.500 It took them a long time to figure it out.
00:07:43.580 But to us, like, computers just always existed.
00:07:46.920 They were just – I mean, maybe they were kind of new, but they were always around.
00:07:50.780 And then, like, you know, a kid that is, like – there was this video on YouTube I saw,
00:07:56.220 like, maybe 12 years ago, something like that, 14 years ago, that has really stuck with me.
00:08:01.940 It was, like, a little baby in a dentist's waiting room or something picking up one of
00:08:05.380 those old glossy magazines and going like this.
00:08:07.880 Oh, I remember that.
00:08:08.920 And to that kid, it was just, like, a broken iPad because that kid had just, like, grown
00:08:12.900 up in a world where, like, there were touchscreens everywhere.
00:08:16.940 And my kid will never grow up – will never, ever be smarter than an AI.
00:08:23.680 That will never happen.
00:08:24.840 You know, a kid born a few years ago, they had a brief period of time.
00:08:26.840 My kid never will be smarter.
00:08:27.880 But also, they'll never – they'll never know a world where, like, products and services
00:08:34.420 aren't way smarter than them and super capable.
00:08:38.380 They can just do whatever you need.
00:08:39.500 And in that world, I think education is going to feel very different.
00:08:42.560 I already think college is, like, maybe not working great for most people.
00:08:47.080 But, yeah, I think fast forward 18 years, it's going to look like a very, very different thing.
00:08:51.380 Yeah.
00:08:52.100 Yeah, do you think there will – oh, here's that video right here of this kid.
00:08:55.260 Yeah, yeah, yeah.
00:08:56.040 All right.
00:08:56.440 I was wrong about the dentist.
00:08:57.300 It was – or maybe there's a few of these.
00:08:59.000 He's like, somebody charge this magazine.
00:09:00.740 He's yelling.
00:09:01.500 How would you recommend to a parent right now to prepare their children for, like, an AI future?
00:09:06.000 Kind of, like, are there certain curtails that you would start to put in now?
00:09:09.300 Are there certain, like, you know, adjustments where you, like, get them in a certain training
00:09:14.200 or have them start to watch certain models of things online?
00:09:16.780 Like, what is that – you know?
00:09:18.640 I actually think the kids will be fine.
00:09:20.360 I'm worried about the parents.
00:09:21.680 If you look at the history of the world here when there's new technology, like, people that grow up with it,
00:09:26.180 they're always fluent.
00:09:27.320 They always figure out what to do.
00:09:28.420 They always learn the new kind of jobs.
00:09:30.440 But if you're, like, a 50-year-old and you have to, like, kind of learn to do things in a very different way,
00:09:36.440 that doesn't always work.
00:09:37.900 Yeah.
00:09:38.000 So I think the kids are going to be fine.
00:09:40.280 I mean, I do have worry – like, I do have worries about kids and technology.
00:09:44.400 Like, I think this scrolling, the kind of, like, you know, short video feed dopamine hit,
00:09:50.640 it feels like it's probably messing with kids' brain development in a super deep way.
00:09:54.300 So it's not that I have no worries.
00:09:55.540 I have, like, extremely deep worries about what technology is doing to kids.
00:09:58.360 But in terms of kids' ability to, like, be prepared for the future and use a new technology,
00:10:03.180 they seem really good at that.
00:10:04.340 Yeah.
00:10:04.600 Always through history.
00:10:05.260 That's a good point, actually.
00:10:06.640 Yeah.
00:10:06.800 It's like if you just grow up with it, it's just like having – it's just totally normal.
00:10:10.320 It's like having kneecaps or whatever.
00:10:11.700 You're just kind of used to it.
00:10:13.440 You can't imagine the world where it doesn't exist.
00:10:16.020 You just –
00:10:16.460 Right.
00:10:16.720 Yeah.
00:10:17.300 Yeah.
00:10:17.780 That's a good point.
00:10:18.880 I remember when I was in school, in, like, junior high, and Google first came out.
00:10:24.400 Mm-hmm.
00:10:25.080 And all the teachers, like, freaked out.
00:10:27.280 And they're like, this is the end of education.
00:10:29.220 You know, if you – why do you have to memorize history, facts in history class,
00:10:33.140 if you could just look them up instantly on the internet?
00:10:34.960 You don't even have to learn to go to the library.
00:10:37.080 And the answer is, like, yeah, maybe memorization is less important.
00:10:39.780 But with these new tools, you can think better, come up with new ideas, do new stuff.
00:10:43.740 I'm sure the same thing happened with the calculator before.
00:10:45.920 Yeah.
00:10:46.100 And, you know, now this is, like, this is just a new tool that exists in the tool chain.
00:10:50.440 And what about, like, say if there is somebody, though, that's, like, learning history right now.
00:10:54.120 Like, they just started their second year of college.
00:10:56.640 Oh, that Celsius.
00:10:57.240 Yeah, that thing will definitely – you won't be able to blink for a month, homie.
00:11:00.340 That thing will – yeah.
00:11:01.520 You'll sneeze and release 5.0, dude.
00:11:03.760 You'll freaking – are you guys at 4.5 already?
00:11:05.740 We're at 4.5 already.
00:11:06.580 5.0 is – I think it's going to be great.
00:11:08.540 Oh.
00:11:08.820 It will come out fast if you add that Celsius.
00:11:10.540 I'm just saying you pour it.
00:11:11.080 Maybe the researchers need it, not me.
00:11:12.600 But, you know, we'll get them some.
00:11:14.180 Yeah, that thing will get you there, man.
00:11:16.480 So say there's somebody just, for example, like there's learning history right now.
00:11:19.100 They're in their second year of college.
00:11:20.160 They're taking history.
00:11:21.200 Is that – are there some subjects in, like, they're going to be a historian?
00:11:26.260 Is that still a viable space of work as AI moves forward, do you think, honestly?
00:11:31.900 I assume there will be some version of it that is – I think it's very hard to predict exactly how something evolves or predict exactly what the jobs of the future are going to be.
00:11:46.940 Like, the – you know, not that long ago, it would have been very hard to predict either of our jobs.
00:11:56.080 If you go back 100 years, the idea of, like, this CEO of an AI company or a podcaster, like, you know, probably would have been things that didn't seem to be the most obvious evolutions of the things people were doing at the time.
00:12:11.020 Yeah, you just seemed almost probably crazy even in trying to explain those to someone.
00:12:15.380 You would.
00:12:15.900 And now, in fact, two of the – I heard that the job that young people most want is some version of your job.
00:12:21.780 The job that young people most want is to be, you know, a podcast influencer, YouTube.
00:12:27.520 They want a YouTube channel.
00:12:28.340 Like, whatever it is, they – like, six, seven-year-olds, they don't know how to describe it, but that's what they want.
00:12:33.360 And a lot of people also want my job.
00:12:34.860 They want to do, like, a startup or they want to work on AI.
00:12:37.040 And these just didn't exist.
00:12:38.920 So, like, the rate with which the new things come along is fast and also trying to predict what they are.
00:12:45.760 I don't know.
00:12:46.500 The thing I say all the time is no one knows what happens next.
00:12:48.800 It's like, we're going to figure this out.
00:12:50.340 It's this weird emergent thing.
00:12:51.780 Does the current job of a historian exist in the same way?
00:12:55.560 I would bet not quite.
00:12:56.840 But another thing I believe is that humans are obsessed with other people.
00:13:01.200 Like, we are so deeply wired to care about other people, to care about stories.
00:13:04.820 And history, our own history, is extremely interesting to us.
00:13:07.940 So, I would say somehow or other, we're still going to care about that.
00:13:11.640 There's going to be some kind of job doing that.
00:13:13.280 Man, that's cool.
00:13:15.240 I guess I – if – when I take that avenue of thought, like, okay, there will still be this historian or somebody, it will be some evolution of that, right?
00:13:25.200 That does seem kind of cool to me because there's a level of creativity in there.
00:13:28.980 There's a level of, like, faith and spontaneity in there that I think is kind of exciting.
00:13:33.700 So, yeah, I guess I hadn't really thought about that.
00:13:35.280 As soon as I get stuck in this doomsday thing, like I just see, like, you know, like the history book closes and they're like, we have enough.
00:13:41.380 We have all the history over here, you know?
00:13:43.700 You know, people used to say, like, oh, there's no need for more music.
00:13:46.780 We've made perfect music.
00:13:48.340 Like, why does anyone need anyone to create anymore?
00:13:50.300 Yeah.
00:13:50.500 And that's obviously ridiculous.
00:13:51.960 Yeah.
00:13:52.180 Or they would say there's that famous patent office quote, everything that humans ever possibly need has been invented.
00:13:56.860 There's nothing left to do.
00:13:57.940 I have heard that.
00:13:59.000 But here we are.
00:13:59.700 Here we are.
00:14:00.340 And, like, someone asked me the other day, like, you know, how long is it until you can make, like, an AI CEO for OpenAI?
00:14:08.500 And I was like, probably not that long.
00:14:09.920 And they were like, well, aren't you really sad about that?
00:14:12.640 And I was like, no, I think it's awesome.
00:14:14.160 I'm for sure going to figure out something else to do.
00:14:15.980 I'm excited to do that.
00:14:17.140 Like, I think that's great.
00:14:19.380 Right.
00:14:19.920 So you could create something that would have your job, but then you could do something else.
00:14:26.660 But then how do you know that you'll still get paid for your job, I guess?
00:14:29.900 Like, well, it's kind of a big question.
00:14:32.420 I kind of think that.
00:14:34.020 But yeah, I guess the frame of that question might be better.
00:14:35.900 Like, say there are jobs that get curtailed by.
00:14:38.020 There will be some.
00:14:38.780 Okay.
00:14:39.020 I think it's important to be honest about that.
00:14:40.660 There will be some jobs that totally go away.
00:14:43.420 But mostly, I think we will rely on the fact that people's desire for more stuff, for better experiences, for, you know, a higher social status or whatever, that seems basically limitless.
00:14:57.140 Human creativity seems basically limitless.
00:14:59.260 And human desire to be useful to each other and to connect with each other and focus on other people seems pretty limitless, too.
00:15:06.540 So I think throughout all of history, there have been these predictions like, you know, we're going to like all be on the beach and work an hour a day or hour a week or whatever, and we're going to have unlimited wealth.
00:15:19.520 And I've never heard that.
00:15:21.520 I've never heard that.
00:15:21.960 I mean, I love that.
00:15:23.280 I mean, they used to say this.
00:15:24.340 They used to say, like, the Industrial Revolution, people were like, oh, you know, we just figured out how to automate, like, man's lot in life.
00:15:30.280 There's nothing left to do.
00:15:31.260 We're going to have these machines do all the work.
00:15:32.500 It makes sense, probably.
00:15:33.540 And you watch these machines doing all this stuff that only people used to physically do.
00:15:38.120 And everybody panicked and said, there's going to be no more jobs.
00:15:41.520 And we figured out new stuff to want.
00:15:44.640 Now, here's an interesting thing.
00:15:45.860 If you could go back to that Industrial Revolution time and people before that were, you know, really on the grind, working super hard, trying to, like, kind of have enough food to survive.
00:15:56.500 Go back to those people.
00:15:58.660 Look at our jobs today.
00:16:00.100 Would those people say we have real jobs?
00:16:01.980 Or would they say you have unbelievable abundance, unbelievable wealth, so much food to eat, incredible luxury, and you guys are just, like, playing a game to entertain yourselves?
00:16:10.680 Is that a real job or not?
00:16:12.000 And they would probably say where they sit, what you guys do is not a real job.
00:16:15.640 You guys are, you know, you're too rich.
00:16:18.140 You're wasting your time.
00:16:18.920 You're trying to, like.
00:16:20.180 Yeah, you guys are a couple of dang zest lords out there freaking playing Uno in the park or whatever.
00:16:24.720 They would not.
00:16:25.280 I don't think my grandfather would be like, you have a job.
00:16:27.180 He would still be like, you need to get a job.
00:16:28.840 Yeah, totally.
00:16:29.820 Yeah.
00:16:30.260 And when we look forward to another hundred years of what people are doing, they'll probably think they're working very hard.
00:16:35.460 It'll feel very satisfying, very intense to them.
00:16:37.540 They're really, like, they'll feel engaged.
00:16:39.260 They'll be making other people happy.
00:16:40.600 They'll be creating value for each other.
00:16:41.940 But if we could look forward to that hundred years of those guys, do you think we would say they're working?
00:16:46.080 Or, like, man, you have, like, AI doing everything for you.
00:16:48.260 You're just trying to entertain yourselves.
00:16:49.640 Yeah, like, oh, you guys have it so easy.
00:16:52.060 Right.
00:16:52.560 But I think that's beautiful.
00:16:53.620 I think it's great that those people in the past think we have it so easy.
00:16:56.440 I think it's great that we think those people in the future have it so easy.
00:16:58.820 Like, that is the beautiful story of us all contributing to human progress and everybody's lives getting better and better.
00:17:04.940 Say we're able to get to that space, right, like the movement that happens with AI and with just technology, which will advance quicker, I think, which is one thing that AI feels like to me it's a fast-forward button on technology and on possibility because things can be – information can be quantified so quick.
00:17:22.480 And a lot of, like, more menial tasks, even though they're not really menial in people's lives, but menial hypothetically can be done quicker to get a lot of the framework for things done fast.
00:17:34.100 But how will people survive?
00:17:36.240 Like, how do we adjust our structure of, like, if some people own the companies that have the AI and then a lot of people are just using the AIs and the agents created by AIs to do things for them?
00:17:51.760 How will society, like, societal members still be able to financially survive?
00:17:56.420 Will there still be money?
00:17:57.480 What is that – does that make any sense, that question?
00:17:59.680 It totally makes sense.
00:18:00.660 Okay, sorry.
00:18:01.200 I don't know.
00:18:01.940 Neither does anybody else.
00:18:02.680 But I'll tell you my current best guess.
00:18:04.380 Okay.
00:18:04.880 Well, I'll say two guesses.
00:18:06.180 One, I think it is possible that we put, you know, GPT-7 or whatever in everybody's ChatGPT.
00:18:13.520 Everybody gets it for free.
00:18:14.760 And everybody has access to just this, like, crazy thing such that everybody can be more productive, make way more money.
00:18:21.880 It doesn't actually matter that you don't, like, own the cluster itself.
00:18:24.720 But everybody gets to use it, and it turns out even getting to use it is enough that people are, like, getting richer, faster, and more distributed than ever before.
00:18:32.420 That could happen.
00:18:33.220 I think that really is possible.
00:18:35.340 There's another version of this where the most important things that are happening are these systems are discovering, you know, new cures for diseases, new kinds of energy, new ways to make spaceships, whatever.
00:18:49.040 And most of that value is accruing to the, like, cluster owners, us, just so that I'm not dodging the question here.
00:18:55.260 And then I think society will very quickly say, okay, we got to have some new economic model where we share that and distribute that to people.
00:19:05.060 I used to be really excited about things like UBI.
00:19:07.800 I still am kind of excited, like universal basic income where you just give everybody money.
00:19:11.300 Yeah, you hear that term a lot.
00:19:12.120 Yeah, universal basic income.
00:19:13.600 Yeah, I heard you and Rogan talk about that too a while back.
00:19:15.680 I still am kind of excited about that, but I think people really need agency.
00:19:20.260 Like, they really need to feel like they have a voice in governing the future and deciding where things go.
00:19:26.220 And I think if you just, like, say, okay, AI is going to do everything and then everybody gets, like, a, you know, dividend from that, it's not going to feel good.
00:19:37.300 And I don't think it actually would be good for people.
00:19:39.640 So I think we need to find a way where we're not just, like, if we're in this world, where we're not just distributing money or wealth.
00:19:48.020 Like, actually, I don't just want, like, a check every month.
00:19:51.620 What I would want is, like, an ownership share in whatever the AI creates so that I feel like I'm participating in this thing that's going to compound and get more valuable over time.
00:19:58.640 So I sort of like universal basic wealth better than universal basic income.
00:20:02.900 And I think I don't like basic either.
00:20:04.840 I want, like, universal extreme wealth for everybody.
00:20:07.680 But even then, like, I think what people really want is the agency to kind of co-create the future together.
00:20:13.960 And in a world where it's, like, the AI is mostly coming up with the new scientific inventions, at least we've got to still have humans, like, invent the new culture and have that be a very distributed thing.
00:20:29.600 Okay.
00:20:30.000 I guess, yeah, I see what you're saying.
00:20:32.380 But would that be, like, an American thing, do you think?
00:20:35.140 Like, since they were invented here?
00:20:36.620 Or do you think, I'm just wondering, what does that look like, you know?
00:20:40.920 The economic model of it all or the whole thing?
00:20:42.860 Yeah, or, like, is there a dividend of the company that then is divided up between the masses, sort of?
00:20:50.200 I mean, a crazy idea, but in the spirit of crazy ideas, is that if the world, there's, like, roughly 8 billion people in the world.
00:20:57.140 If the world can generate, like, 8 quintillion tokens per year, if that's the world.
00:21:04.380 Actually, let's say the world can generate 20 quintillion tokens per year.
00:21:09.300 Tokens of?
00:21:10.080 Like, each word generated by an AI.
00:21:12.080 Okay.
00:21:12.320 Just making up a huge number here.
00:21:13.820 We'll say, okay, 12 of those go to, you know, the normal capitalistic system.
00:21:17.780 But 8 of those, 8 quintillion tokens, are going to get divided up equally among 8 billion people.
00:21:23.640 So everybody gets 1 billion tokens.
00:21:26.460 And that's your kind of universal basic wealth globally.
00:21:30.740 And people can sell those tokens.
00:21:32.560 Like, if I don't need mine, I can sell them to you.
00:21:34.480 We could pool ours together for some, like, new art project we want to do.
00:21:37.680 But instead of just, like, getting a check, everybody on Earth is getting, like, a slice of the world's AI capacity.
00:21:44.720 And then we're letting the, like, massively distributed human ingenuity and creativity and economic engine do its thing.
00:21:51.380 I mean, that's, like, a crazy idea.
00:21:53.380 Maybe it's a bad one.
00:21:54.460 But that's the kind of thing that I think sounds like someone should think about it more.
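A minimal back-of-the-envelope sketch of the token-split idea described above, using the hypothetical numbers from the exchange (20 quintillion tokens generated per year, 12 quintillion flowing through the normal market, the remaining 8 quintillion split evenly across roughly 8 billion people). The variable names and yearly framing are purely illustrative, not an actual OpenAI proposal.

```python
# Hypothetical numbers taken from the conversation above; purely illustrative.
QUINTILLION = 10**18

total_tokens_per_year = 20 * QUINTILLION              # world AI capacity in the thought experiment
market_tokens = 12 * QUINTILLION                      # goes through the normal capitalistic system
shared_pool = total_tokens_per_year - market_tokens   # 8 quintillion, divided equally

world_population = 8 * 10**9                          # roughly 8 billion people

tokens_per_person = shared_pool // world_population
print(f"Each person's yearly share: {tokens_per_person:,} tokens")
# With these round numbers the even split works out to 1,000,000,000 tokens per person per year,
# which people could then use, sell, or pool as described above.
```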
00:21:58.260 One of the big fears is, like, purpose, right?
00:22:00.220 Like, human purpose.
00:22:01.120 Like, work gives us purpose.
00:22:02.220 And also, I think the idea that we are the ones advancing humanity gives us purpose.
00:22:09.200 Like, we are the, like, yeah, like, we have some control over our own destiny maybe gives us this sense of purpose.
00:22:20.420 And it feels like that we would lose a sense of purpose or that purpose would be adjusted, like, if AI is to really, you know, continue to advance so quickly.
00:22:30.820 It feels like our sense of purpose would start to really disappear.
00:22:36.060 Have you had thoughts about that?
00:22:37.540 I worry about this a lot.
00:22:38.860 So, I think people have worried about this with every big technological revolution.
00:22:42.720 Got it.
00:22:42.840 But I agree that this time it feels different.
00:22:44.660 Okay, yeah, because if, say you had an axe and some guy came out with a saw, you're like, yeah, that's, you know.
00:22:50.480 Or even if they come out with, like, a robot that cuts the tree down, it still feels fine.
00:22:54.880 But, like, creativity, intelligence, I think, cuts so deeply at the core of whatever we are and how we value ourselves.
00:23:05.420 One example we can look at this right now, I think one area where AI is having a big impact is on how people write software for a living.
00:23:11.940 And AI is really good at that.
00:23:13.280 It's really changed what it means to be a software developer.
00:23:17.740 I haven't heard any of those software developers say that they, even though their job is different, that they don't have meaning.
00:23:22.740 They still enjoy it.
00:23:23.780 They're operating at a higher level.
00:23:26.900 And I'm hopeful, at least for a long time, you know, 100 years from now, who knows.
00:23:31.260 But I'm hopeful that that's what it'll feel like with AI is even if we're asking it to solve huge problems for us.
00:23:36.080 Even if we ask it to say, like, you know, go discover a cure for cancer.
00:23:40.500 There will still be a lot of things to do in that process that feel valuable to a person.
00:23:50.420 You're still asking it the questions.
00:23:51.840 You're still, like, helping guide it.
00:23:53.020 You're still framing it or whatever it is.
00:23:54.660 You're still, like, talking to the world about it.
00:23:56.400 And I think all of human history suggests that we find a way to put ourselves at the center of the story and feel really good about it.
00:24:10.080 Like, you know, if you kind of think, like, we used to think that the Earth was the center of the solar system.
00:24:17.460 And then we're, like, very human-centric view.
00:24:19.720 And then we're, like, okay, fine.
00:24:20.680 The sun is the center of the solar system.
00:24:22.200 But the solar system is at least the center of the galaxy.
00:24:25.100 And now, oh, man, there's a lot of galaxies.
00:24:26.840 And, oh, man, now we're this, like, tiny speck in this, like, very huge universe.
00:24:31.820 And yet we still manage to feel all, like, a lot of main character energy.
00:24:36.880 And so I somehow think even in a world where AI is doing all of this stuff that humans used to do, we are going to find a way in our own telling of the story to feel like the main characters.
00:24:47.420 And I think in an important sense we will be.
00:24:50.240 And that's really good.
00:24:52.300 I also, like, you know, probably already today there could be a very compelling version of two AIs talking like this.
00:25:02.780 And I don't think I'd want to watch that.
00:25:04.220 Like, I think I really do feel deeply wired to, like, care about the real person behind it.
00:25:10.120 I think that's, like, deep in the biology.
00:25:11.780 Right.
00:25:13.100 Yeah, that's the part that I think a lot of times it's, like, even though you can get into, like, these wormholes of, like, possibility and these fear holes of possibility or kind of these dystopian ideas, that in the end I'm, like, I'd rather probably watch something that's real, you know?
00:25:29.900 It's, like, because I'm real, you know what I'm saying?
00:25:32.400 Like, I don't want to talk really to a robot.
00:25:34.960 I'd, you know.
00:25:36.620 Yeah, I think in the end there's going to be a part of you that wants to continue to just talk to humans.
00:25:42.840 Do you – what's, like, one of your fears?
00:25:46.680 Like, what's a fear you have of AI?
00:25:49.800 Like, if you have, like, a fearful space that it could go?
00:25:52.760 Like, I know you mentioned it a little bit.
00:25:54.720 This morning I was testing our new model and I got a question.
00:25:58.980 I got emailed a question that I didn't quite understand.
00:26:03.140 And I put it in the model, this GPT-5, and it answered it perfectly.
00:26:07.200 And I really kind of sat back in my chair and I was just, like, oh, man, here it is moment.
00:26:15.820 And I got over it quickly.
00:26:17.160 I got busy on to the next thing.
00:26:19.160 But it was, like, I mean, it was what kind of we were talking about.
00:26:21.380 I felt, like, useless relative to the AI in this thing that I felt like I should have been able to do and I couldn't and it was really hard.
00:26:27.960 But the AI just did it like that.
00:26:29.220 Yeah.
00:26:29.900 It was a weird feeling.
00:26:30.860 Yeah, I think that's – I think that feeling right there, that's the feeling a lot of people kind of have, like, what's going – you know, when does it happen?
00:26:41.120 What's going to happen?
00:26:43.920 But I think some of it is it's like, yeah, it's hard to conceptualize until you're further along.
00:26:49.780 I'm – totally.
00:26:51.520 I don't think we know quite how that's going to feel.
00:26:53.620 You just have to, like, approach it step by step.
00:26:55.520 Another thing I'm afraid of and we had a, you know, a real problem with this earlier but it can get much worse is just what this is going to mean for users' mental health.
00:27:09.600 There's a lot of people that talk to ChatGPT all day long.
00:27:12.140 There are these sort of new AI companions that people talk to like they would a girlfriend or a boyfriend.
00:27:16.560 And we were talking earlier about how it's probably not been good for kids to, like, grow up, like, on the dopamine hit of scrolling, you know, TikTok or whatever.
00:27:27.480 Yeah, do you think that – how do you keep, like, AI from having that same effect, like, that negative effect that social media really has had?
00:27:35.460 I'm scared of that.
00:27:37.000 I don't have an answer yet.
00:27:38.780 I don't think we know quite the ways in which it's going to have those negative impacts.
00:27:42.940 But I feel for sure it's going to have some and we'll have to – I hope we can learn to mitigate it quickly.
00:27:50.520 Can AIs – can they pull up pornography and stuff like that too or no?
00:27:54.100 Sure.
00:27:54.560 Oh, my God.
00:27:56.800 God, I didn't know that.
00:28:00.260 No, it's fine.
00:28:01.420 Yeah, but I just – yeah, I don't even need to know that.
00:28:04.420 I'm going to have that stricken from my own record.
00:28:07.940 Crypto is – it's kind of blowing up again, you know?
00:28:11.480 Some people say it's back.
00:28:12.560 It's not back.
00:28:13.220 It's – one of the best things is it hasn't left.
00:28:16.720 It has maintained itself as a viable form of currency.
00:28:20.820 And I'm back in.
00:28:22.320 I'm back invested.
00:28:23.540 And when I need more Bitcoin or Solana or XRP, MoonPay is always the first app I open.
00:28:31.420 Since MoonPay works with Apple Pay, Venmo, PayPal, bank accounts, and credit cards, it's fast and easy to get what I need in a few clicks.
00:29:11.660 While keeping full control of your assets.
00:29:15.140 Remember, while MoonPay makes buying crypto straightforward, it's essential to do your own research and understand the risks involved.
00:29:22.240 Crypto trading can be volatile and you could lose your investment.
00:29:25.560 MoonPay is a tool to facilitate your transactions, not a source of financial advice.
00:29:30.320 Trade responsibly.
00:29:31.320 What happens when your health becomes the punchline?
00:29:35.420 That's my question.
00:29:36.960 It feels like that's where we're at.
00:29:38.600 From seed oils to stress, toxins, pollutants, the modern world is screwing with our health at the cellular level, leading to exhaustion, brain fog, digestive issues, and more.
00:29:53.540 But here's the thing.
00:29:54.700 You don't have to settle for feeling like garbage 24-7.
00:29:57.900 Armra Colostrum is nature's original health hack.
00:30:03.860 Packed with over 400 bioactive nutrients that fortify gut integrity, strengthen immunity, revitalize hair growth, fuel stamina, elevate focus, and help you function like a human again.
00:30:16.640 I love using it in my smoothies at home.
00:30:19.580 I'll make me a little smoothie.
00:30:20.880 Bang dong.
00:30:22.100 Put in some blueberries, spinach, banana, banana, hit it with a little packet of Armra Colostrum.
00:30:28.780 We've worked out a special offer for our audience here.
00:30:31.800 Receive 15% off your first order.
00:30:33.800 Go to tryarmra.com slash theo or enter T-H-E-O to get 15% off your first order.
00:30:41.260 That's T-R-Y-A-R-M-R-A dot com slash theo.
00:30:49.100 What legal system does AI have to work by?
00:30:52.820 Is there like a legal – like we have laws like in the world, right, like in the human world.
00:30:59.040 Does AI have to work by any like legal laws, you know?
00:31:03.400 Yeah, so I think we will certainly need a legal or a policy framework for AI.
00:31:11.860 One example that we've been thinking about a lot – this is like a – maybe not quite what you're asking.
00:31:16.140 This is like a very human-centric version of that question.
00:31:19.100 People talk about the most personal shit in their lives to ChatGPT.
00:31:22.520 It's – you know, people use it.
00:31:24.000 Young people especially like use it as a therapist, a life coach, having these relationship problems.
00:31:28.260 What should I do?
00:31:28.860 And right now if you talk to a therapist or a lawyer or a doctor about those problems, there's like legal privilege for it.
00:31:37.040 You know, like it's – there's doctor-patient confidentiality.
00:31:40.140 There's legal confidentiality, whatever.
00:31:42.880 And we haven't figured that out yet for when you talk to ChatGPT.
00:31:47.500 So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, like we could be required to produce that.
00:31:53.400 And I think that's very screwed up.
00:31:54.600 I think we should have like the same concept of privacy for your conversations with AI that we do with a therapist or whatever.
00:32:03.640 And no one had to think about that even a year ago.
00:32:06.140 And now I think it's this huge issue of like how are we going to treat the laws around this?
00:32:10.300 Well, do you think there should be like kind of like a slowing things down before we move there kind of?
00:32:15.460 Because, yeah, that is kind of wild.
00:32:16.800 It's one of the reasons I get scared sometimes to use certain AI stuff because I don't know how much personal information I want to put in because I don't know who's going to have it.
00:32:25.260 I think we need this point addressed with some urgency.
00:32:28.880 And, you know, the policymakers I've talked to about it like broadly agree.
00:32:32.220 It's just – it's new and now we got to do it quickly.
00:32:34.760 Do you talk to ChatGPT?
00:32:36.420 I don't talk to it that much.
00:32:37.680 One of the –
00:32:38.140 It's because of this?
00:32:39.160 I think it is.
00:32:39.840 It's because it's like –
00:32:40.720 I think it makes sense.
00:32:42.380 To not talk to ChatGPT because of it?
00:32:43.700 No, no, no.
00:32:44.040 Like to like really want the privacy clarity before you use it a lot.
00:32:48.120 Yeah.
00:32:48.380 Like the legal clarity.
00:32:49.500 Yeah, it's scary.
00:32:50.280 And it's like, well, how long does it take lawmakers to come up with that?
00:32:53.080 And then it feels like it's moving so fast that it's like it doesn't even – that sometimes it's like it doesn't even really matter.
00:32:59.200 It's like are we even waiting for the laws to be put around this or what's going on?
00:33:04.080 Does it feel like it's moving too fast for you sometimes?
00:33:08.040 The last few months have felt very fast.
00:33:10.000 It feels faster and faster, but the last few months have felt very fast.
00:33:13.040 Yeah, I was watching this guy, Yoshua Bengio?
00:33:17.340 Yoshua Bengio.
00:33:18.240 Yoshua Bengio.
00:33:19.980 And he's kind of like – some people call him the father of AI.
00:33:22.900 He may be self-proclaimed.
00:33:24.020 I'm not really sure.
00:33:25.760 But he's certainly seen to be kind of like a lifeguard for AI, like thinking about like how do we keep the pool safe, how much water should be in it, the chlorine, how many lifeguards do you need on duty, that type of thing hypothetically.
00:33:38.600 And he was saying that some AIs, they have like deception techniques inside of them, like that there were AIs that would rather give you an answer that was possibly pleasing to the user
00:33:55.120 than to give them the factual answer.
00:33:58.320 And then he was also saying that there were AIs that were developing some of their own languages to communicate with each other, which would be languages that we don't even know.
00:34:10.260 What is that?
00:34:10.840 How do you guys curtail that when those types of things come up?
00:34:13.440 What does that even kind of feel like to you guys?
00:34:16.040 Are these just problems that happen in new spaces and you figure it out as you go?
00:34:21.520 There are these moments in the history of science where you have a group of scientists look at their creation and just say, you know, what have we done?
00:34:31.600 Maybe it's great.
00:34:32.940 Maybe it's bad.
00:34:33.700 But what have we done?
00:34:34.400 Like maybe the most iconic example is thinking about the scientists working on the Manhattan Project in 1945, sitting there watching the Trinity test and just, you know, this thing that it was a completely new, not human scale kind of power.
00:34:52.140 And everyone knew it was going to reshape the world.
00:34:55.380 And I do think people working on AI have that feeling in a very deep way.
00:35:00.500 You know, we just don't know.
00:35:02.680 Like we think it's going to be great.
00:35:03.920 There's clearly real risks.
00:35:06.000 It kind of feels like you should be able to say something more than that.
00:35:09.980 But in truth, I think all we know right now is that we have discovered, invented, whatever you want to call it, something extraordinary that is going to reshape the course of human history.
00:35:20.760 Dear God, man.
00:35:21.740 But if you don't know, we don't know.
00:35:24.660 Well, of course.
00:35:26.120 I mean, I think no one can predict the future.
00:35:29.240 Like human society is very complex.
00:35:31.220 This is an amazing new technology.
00:35:32.520 Maybe a less dramatic example than the atomic bomb is when they discovered the transistor a few years later.
00:35:40.220 The transistor radio?
00:35:41.200 The transistor, the part that, you know, made computers and radios and everything else.
00:35:45.280 But we discovered this completely new thing that enabled the whole computer revolution and is in this microphone and those computers and our iPhones and like the world would be so different if people had not discovered that.
00:35:56.060 And then over the decades figured out how to make them smaller and more efficient.
00:35:59.380 And now we don't even think about it because the transistors are just in everything.
00:36:03.080 We have all this modern technology from that one scientific discovery.
00:36:06.480 And I do think that's what AI is going to be like.
00:36:09.280 We had this one crazy scientific discovery that led to these language models we all use now.
00:36:15.800 And that is going to change the course of society in all kinds of ways.
00:36:21.340 And, of course, we don't know what they all are.
00:36:23.480 Damn.
00:36:24.180 I was hoping you knew by the end of that sentence.
00:36:26.760 Or I was hoping you would, you know.
00:36:28.600 Like, that's what we're – because we don't know.
00:36:31.720 You know, like, that's, I think, the tough thing.
00:36:33.760 There's no time in human history at the beginning of a century where the people ever knew what the end of the century was going to be like.
00:36:39.400 Yeah.
00:36:39.880 So maybe it's – I do think it was faster and faster each century.
00:36:44.620 Certainly, like, you know, in 1900, you couldn't have predicted what 2000 was going to be like.
00:36:48.800 I think in 2000, you could even less predict what 2100 was going to look like.
00:36:53.480 But that's kind of why it's exciting.
00:36:55.060 And, like, that's kind of why people get to figure out and unfold the story as we go.
00:36:59.200 It's kind of bizarre because there's a part of me that's like, this guy is out of his mind.
00:37:02.880 This guy is a wild wizard.
00:37:05.320 You know, there's a couple different things.
00:37:06.800 But then there's also this part of me that's like, this guy is this hopeful guy who's, like, involved in this crazy space.
00:37:15.680 And he kind of has this whimsical energy about the future, which is, in a crazy way, a nice energy to have about the future generally is that something could happen or that things are possible.
00:37:30.880 So it just – yeah, it's all kind of like – I don't know.
00:37:34.040 It's fascinating.
00:37:34.860 It's definitely fascinating to me.
00:37:37.340 Sam, to kind of pivot a little bit, there's – it feels like there's a race right now in AI, right?
00:37:42.180 Would you say that there's a race between companies and AI?
00:37:45.400 It certainly feels that way.
00:37:46.920 Yeah.
00:37:47.700 And it almost feels like you guys are the new Formula One drivers or you guys are, like, the new, like – it's like Mario Andretti or you guys are the new, like, Bubba Wallace and all the – you know.
00:37:59.200 It's almost like these are the new race cars that everybody's kind of watching position themselves.
00:38:06.700 What is the race for?
00:38:08.340 Because you hear about AI and then you hear about AGI and then you hear about superintelligence.
00:38:15.580 What is this race that's going on?
00:38:18.020 How real is it?
00:38:19.040 And what is the race for?
00:38:21.320 When I was a kid, the race was, like, the megahertz race and then it became the gigahertz race.
00:38:25.320 Everybody wanted a computer with a faster processor.
00:38:27.740 Oh, yeah.
00:38:28.120 You know, Intel would come out with this one and then AMD would come out with this one.
00:38:31.080 And, like, it turned out that those gigahertz measurements eventually were not even that helpful.
00:38:38.440 Like, you could have one that had a lower number and it was – in practice, it was faster.
00:38:43.240 And eventually, I think it was Apple that realized they should just stop talking about the clock speed of their computers.
00:38:49.160 And you probably don't even know what the processor speed of your iPhone is today.
00:38:52.820 Yeah, it's true.
00:38:53.280 Yeah, that was a big thing and it kind of disappeared.
00:38:55.240 And I think the same thing has been happening in AI where everybody was racing on these benchmarks.
00:39:00.280 You know, I score this on this benchmark and this on that one.
00:39:03.600 And now people are realizing that, like, okay, the benchmarks are kind of saturated.
00:39:07.520 We went through the equivalent of our megahertz race with our benchmark race.
00:39:11.660 And now people kind of don't care about that as much.
00:39:14.880 And now it's, like, who's using the model?
00:39:18.120 Who's getting the value out of it?
00:39:19.920 Things like that.
00:39:20.780 But I do think people still feel like we're heading towards some milestone.
00:39:28.900 What the milestone is, they disagree on.
00:39:30.980 But maybe it's a system that's capable of doing its own AI research and its own sort of self-improvement.
00:39:37.560 Maybe it's a system that is, like, smarter than all of humans put together.
00:39:41.060 But they feel like there is some finish line to cross.
00:39:45.520 I actually don't quite feel like this.
00:39:46.660 But I think a lot of people in the industry do that there's some finish line that we're going to cross.
00:39:50.160 Maybe it's this, like, self-improvement moment.
00:39:53.440 Maybe you call that superintelligence.
00:39:56.120 And I think there is a sort of, there's, like, a race to get somewhere, but people don't agree on where it's to or something.
00:40:04.800 What are you racing towards, do you feel like?
00:40:08.440 It's a great question.
00:40:10.860 I don't have, like, a finish line in mind.
00:40:13.940 There's nothing I could say.
00:40:15.320 I don't think I can articulate anything where I would say, like, this is mission complete.
00:40:18.920 But if I had to give, like, a self-referential answer there, you know, the moment where we would rather give our research cluster, like, our, you know, GPUs that we run all of our AI experiments on, the moment where we would rather give that to an AI researcher rather than our brilliant team of human researchers, that does at least seem like some kind of very different new era.
00:40:43.880 Yeah, and at that point, who's even we?
00:40:46.840 I feel like it's just you kind of, like, wheeling the stuff across the hall and, you know, like, who's going to, you know what I'm saying?
00:40:51.520 Like, you know what I'm saying?
00:40:52.580 It starts to get this idea, like, if we keep, if things were to keep leaving the people and go to the computer, you're just shoveling coal into the AI hypothetically, you know?
00:41:04.140 Well, again, I assume that what will happen, like with every other kind of technology, is we'll realize, like, we, there's this one thing that the tool's way better than us at.
00:41:13.540 Now we got to go solve some other problems, so let's put our brain power there.
00:41:16.100 I, I still don't think it'll ever feel like we all just get to, like, push a button and go on vacation.
00:41:21.600 Got it.
00:41:22.980 Um, like, we will, I think as, what one version of this is as, uh, as capabilities go up, because as we get better tools, the expectation goes way up too.
00:41:33.480 And so we've got to, like, yes, we get much better tools, but we have to do way more to remain competitive.
00:41:39.240 Well, I think there's this hopeful idea, say if you come up with all these.
00:41:41.900 Or maybe not, like, maybe, maybe the AI is just better than us at absolutely everything.
00:41:44.740 And we just sit there being like, all right, that was cool.
00:41:47.580 Yeah, because at a certain point, if something has all the information, right?
00:41:51.780 If something has all the information and it can think and, and, and ponder and, uh, pontificate and serve multi options of answers, don't we then work in for that thing?
00:42:02.200 Like, that's what I start to wonder.
00:42:03.380 Like, if it's the smartest thing in the room.
00:42:06.760 GPT-5 is the smartest thing.
00:42:07.880 GPT-5 is smarter than us in almost every way, you know?
00:42:11.360 And yet, here we are.
00:42:12.920 So there's like, there's something about the way the world works.
00:42:17.140 There's something about, this doesn't mean it's true forever, but there's something about what humans can do today that is so different.
00:42:23.760 There's also something about what humans care about today that is so different than AI that I don't think the simplistic thing quite works.
00:42:31.080 Now, again, by the time it's a million times smarter than us, who knows?
00:42:34.780 Is part of you want to kind of get there?
00:42:36.840 Like, how do we get, we're like, I open the door and you, and I say, excuse me, sir.
00:42:41.180 And it's just my computer in there.
00:42:42.700 You know what I'm saying?
00:42:43.500 Like, you know, when, when I was a kid, I, I sort of thought about these technological revolutions that happened one at a time.
00:42:54.600 There was the agriculture revolution a long time ago, and that freed us up to do these other things.
00:42:58.940 And then there was the like, there was the age of enlightenment, and there was the industrial revolution, and there was the computer revolution, and all these things happened.
00:43:05.640 And I thought of them as like these distinct things.
00:43:08.180 And now I view it as just this one long compounding exponential where all of these things come together.
00:43:17.080 Each piece of technology is built continuously, overlapping on the one that comes before, and we're able to just do more and more.
00:43:25.860 And so in some sense, AI is this big, special, unique, different thing.
00:43:29.080 And in some other sense, it is just part of this long arc of human progress.
00:43:32.620 We talked about the transistor earlier, but like, that was way more important in some sense to AI happening than the work we do now.
00:43:40.040 And all this stuff has to like compound, compound.
00:43:42.100 You got to build the internet.
00:43:42.880 You got to get all this data.
00:43:43.820 You got to do all these things.
00:43:45.800 And, and I want that exponential to keep going.
00:43:50.360 There will be things way after AI.
00:43:51.820 We'll invent all sorts of new things.
00:43:53.080 We'll go colonize space.
00:43:54.060 We'll go, you know, build neural interfaces.
00:43:55.920 Who knows what else we'll do.
00:43:56.800 But I think at some point, AI fades into that arc of history.
00:44:03.080 We build, we don't, we don't even think about it.
00:44:04.880 It's like transistors, which you don't even think about today.
00:44:06.740 It's just another layer in the scaffolding that humans collectively have built up bit by bit over time.
00:44:13.540 And where you sit in our day, you get to open that door.
00:44:16.220 You have this like computer that only has one interface.
00:44:19.500 You just, it says, what do you want?
00:44:21.100 You say whatever you want, it happens.
00:44:22.820 And you figure out amazing new things to build for the next generation and the next and the next.
00:44:26.960 And we just keep going.
00:44:29.360 Yeah.
00:44:29.740 I think the, the part that I think gets spooky is I can't build any, I can build some stuff, but I can't build like any technological stuff.
00:44:38.500 So then I'm like, dang dude, well, I'm not gonna, what am I going to build over there?
00:44:43.000 Okay.
00:44:43.200 So right now I can write software.
00:44:45.080 Maybe you can't.
00:44:45.940 And I have a little advantage if I want to go build some technological thing.
00:44:49.000 Very soon you can make any piece of software you want.
00:44:52.020 Cause you just ask an AI in English.
00:44:53.600 You say, I got an idea for an app.
00:44:54.900 Make me this thing.
00:44:55.600 And the whole thing just happens.
00:44:57.760 So that's a win for you.
00:44:59.080 Maybe it's a little bit of a loss for me.
00:45:00.280 I think it's kind of cool for the whole world.
00:45:02.060 Yeah.
00:45:02.240 But like this is, this is going to be a technology that anybody can use.
00:45:07.460 You can just like with natural language, you can say, this is what I want.
00:45:12.040 And it goes off and writes the code for you, debugs it for you, deploys it for you.
00:45:15.260 And then you can say, how do I use what I just created?
00:45:17.340 Yeah.
00:45:17.540 But if you have a great idea, AI will just make it happen for you.
00:45:21.360 And this is a new thing.
00:45:22.500 Like this is this, I think this will make technology the most accessible it ever has been.
00:45:26.320 Got it.
00:45:26.900 Okay.
00:45:27.360 Then that seems a little bit different.
00:45:28.760 I think there's this idea in my head that I'm going to have to figure out all this coding.
00:45:31.880 I'm going to figure out all of these different ways to do things to even have a possibility of, of being of use myself in the future.
00:45:39.320 No, I think this is, uh, without talking too much about the future and what we're going to launch, like the fact that you will be able to have an entire piece of software created just by like explaining your idea is going to be incredible for humans getting great new stuff.
00:45:56.000 Because right now I think there's like a lot more good ideas than people who know how to make them.
00:45:59.940 And if AI can do that for us, we're really good at coming up with creative ideas.
00:46:04.520 Yeah.
00:46:04.960 I mean, that's one of the things that people like to do.
00:46:06.680 Um, do you think right now, if, if humans, regular average humans, most humans could vote to keep AI going or to stop AI?
00:46:16.480 What do you think?
00:46:16.980 That's a great question.
00:46:18.140 What do you think that they would vote?
00:46:20.020 This is like totally kind of, I don't have any data for this.
00:46:23.440 I would bet most people who use ChatGPT, which is a lot of people now, they would say like, keep it going.
00:46:28.680 And most people who don't would say it's scary.
00:46:30.720 Stop it.
00:46:31.480 What do you think?
00:46:33.200 Yeah.
00:46:33.560 I feel like most people would say, stop it, I think, or pause it, take the wheels off of it for a month.
00:46:40.940 That kind of thing.
00:46:41.920 Siphon the gas out of the tank, you know, like that kind of thing.
00:46:44.780 Put sugar in it.
00:46:46.260 I think there would be like that kind of thing.
00:46:48.420 What are you most afraid of with it?
00:46:49.720 Or is it just that we're not going to have purpose and we don't know how it's all going to go?
00:46:52.780 Yeah.
00:46:52.940 I mean, those are some of the huge parts, but I think like there's like probably that.
00:46:57.440 I think that in the end, I think there's a general feeling of like, well, if all the trucking jobs disappear, you know, if those become automated and like, yeah, if everything becomes a robotaxi, like, you know, will that feel, you know, where will those people go for jobs?
00:47:16.720 Will everybody just be dancing on TikTok, trying to get people to tip them for trends and stuff, you know, like there's part of that.
00:47:23.080 I had this dream years ago that it all ends with everybody's driving an Uber and literally holding each other at gunpoint to be each other's passengers.
00:47:32.020 Right.
00:47:32.360 Like, get in my car, because that's how bad, like somebody's like, I need to fare more than you do.
00:47:38.080 You know, my whole family's in the backseat, sit, shotgun, we'll get you to where, you know, like people are literally holding each other at gunpoint to subscribe to their OnlyFans and stuff like it's just that dystopian or whatever.
00:47:49.980 Um, so I guess part of that, but then there's a deeper part where it's like, yeah, what comes out of us if it feels like we lose a lot of the regular stuff that we know right now gives us purpose?
00:48:05.040 Is there a new evolution of our purpose?
00:48:08.940 Is there like a blooming inside of us?
00:48:11.080 Is it this utopian place that you almost think of as like a heaven idea where, you know, people are fed and have enough, you know, can, are provided for, can take care of themselves?
00:48:23.720 I guess that's it, that's it.
00:48:25.420 Or what, because purpose gives people, work gives people so much of their purpose, and so to lose those things, what is it, what happens, you know?
00:48:31.580 And I know I kind of keep asking that over and over again.
00:48:33.540 You don't really have the answers and that's, it's okay.
00:48:35.780 Of course, how could you?
00:48:37.160 We're not in the future.
00:48:38.160 I mean, I think people really do love to be useful to each other and people love to express their creativity as part of that.
00:48:49.960 And as the long-term trend of society getting richer has continued, more people I think are able to do, get closer to sort of expressing themselves in the best way that they can.
00:49:04.680 Maybe like, you know, as recently as five or six hundred years ago, not very many people got to be artists.
00:49:14.680 The world wasn't that rich.
00:49:15.880 There were a limited number of patrons that could like pay you to create art, but there were more than zero.
00:49:20.100 And before that, there were almost none.
00:49:21.860 And then you got this beautiful Italian Renaissance and all of this amazing art because there was like excess capital in the world.
00:49:27.660 And now a lot more people can be artists or a lot more people can start startups, which is another, like for me, that's like my expression of creativity.
00:49:35.040 Right.
00:49:36.880 Or more people can create content.
00:49:39.120 Yeah.
00:49:39.300 And this idea that people can find whatever way they can to express themselves, their talent, their vision for kind of collective love of other people and a care for putting their brick in society's progress.
00:49:59.280 I think that can go really far.
00:50:03.620 Now, what art in the future looks like now that AI can make art or help make art?
00:50:08.600 I don't know.
00:50:08.960 It'll probably be kind of different.
00:50:10.360 What startups will look like in the future when people can kind of just say whatever they want to their AI and it can make it for them right then?
00:50:17.020 It will kind of be different.
00:50:17.880 But I think it's such a bad bet to assume that either human creativity or human fulfillment from being useful to other people ends.
00:50:27.900 I think we're just, we stay on this exponential and like each year, each decade, our collective standard of living goes way up.
00:50:35.000 The whole world gets way richer.
00:50:36.400 We all get more.
00:50:37.760 We all expect more.
00:50:38.540 And even over like the course, I was thinking recently, like food is so much better than it was when I was a kid.
00:50:47.460 Like the world has just figured out how to make food better.
00:50:50.840 Like we, you know, know how to, we figured out organic vegetables or whatever it is.
00:50:54.020 I don't know.
00:50:54.240 It just tastes much better.
00:50:55.640 And like, I think that's great.
00:50:57.360 I don't want to go back to eating like the frozen carrots or whatever.
00:51:00.060 Yeah, I guess that's a good point.
00:51:01.640 But then there's some, like I saw this thing the other day.
00:51:03.820 It was like a kitchen.
00:51:04.660 They had like one of those robo kitchens or whatever.
00:51:06.740 You know, when you order food from like something dash or whatever, and then you, but it's like Hank's ribs.
00:51:15.320 And then it's like Marty's pizza.
00:51:17.780 And then it's like Susan's salami shop, but they're all the same place, you know?
00:51:23.260 And when you get that from window dash, you don't, you don't like, you feel like something's missing, right?
00:51:30.480 You're like, ah, this is fake.
00:51:31.900 I can tell.
00:51:32.720 I get less enjoyment.
00:51:33.640 You would rather get that food from like the dude who's been making it and perfecting it on the, you know, that little pizza shop on the corner for the last 20 years.
00:51:42.880 Right.
00:51:43.160 Because that's like part of, like that dude is part of the experience.
00:51:45.800 That authenticity is part of the experience.
00:51:47.480 Right.
00:51:47.940 I don't think that goes away with a like fake robotic thing.
00:51:50.780 Okay.
00:51:51.000 Yeah.
00:51:51.560 Yeah.
00:51:51.880 Because I think I start to feel like we're in this universe where it's like, you're walking down the street or something and like a Waymo goes by and it's like, eat now.
00:51:59.720 And you're like, but, and you already did eat.
00:52:01.780 It's just got a bad reading or something.
00:52:03.380 It's got a bad valve in it or something.
00:52:04.860 You're like yelling at it.
00:52:05.920 There's nobody in there.
00:52:06.960 And you're like, I already ate.
00:52:08.240 It's like, sit down and eat now.
00:52:10.720 And it just like fucking uses like a t-shirt cannon to just like shoot a burrito at you.
00:52:15.220 And then you're sitting there, you're eating that, you know, and then the GLP car goes by, right?
00:52:20.020 And says, I can help you out.
00:52:20.900 Yes.
00:52:21.240 And it's like, obviously you've overeaten and you're like, I didn't even want to eat.
00:52:25.220 That thing's messed up, right?
00:52:26.380 You're yelling at a car that has no driver in it.
00:52:28.840 And then it shoots you with three GLP one darts in the neck.
00:52:31.860 And now your wife doesn't even recognize you when you get home or whatever, you know?
00:52:35.780 The fact that you find this so off-putting, I think, is a sign for optimism.
00:52:42.000 Yeah.
00:52:43.220 That's actually a good point.
00:52:44.180 You're wired.
00:52:44.980 You're going to be resistant to that.
00:52:46.200 That's not going to make you happy.
00:52:47.120 That's not going to make other people happy.
00:52:48.480 Now, maybe we get tricked.
00:52:49.420 Like social media tricked us for a while.
00:52:51.280 We got too addicted to feeds, whatever.
00:52:53.540 But we realized like, actually, this is not helping me be my best.
00:52:57.540 You know, like doing the equivalent of getting the like burrito cannon into my mouth on my phone at night.
00:53:05.320 Like that's not making me long-term happy.
00:53:07.260 Right.
00:53:07.880 And that's not helping me like really accomplish my true goals in life.
00:53:11.600 And I think if AI does that, people will reject it.
00:53:13.740 However, if ChatGPT really helps you to figure out what your true goals in life are and then accomplish those, you know, it says, hey, you've said you want to be a better father or a better, you know, you want to be in better shape or you, you know, want to like grow your business.
00:53:26.840 If you want, we can change that goal and I can help you scroll TikTok all night or, you know, eat the burritos or whatever.
00:53:35.060 And I'll give you the GLP one shots and I'll make you as healthy as you can.
00:53:38.560 But like maybe instead, I can try to help convince you you should go for a run tonight.
00:53:42.060 And I think if AI feels like it is helping you try to accomplish your goals and be your best, that will feel very different than the last generation of technology.
00:53:51.880 Yeah.
00:53:52.580 And you know what?
00:53:53.240 And that's where I'm like, and that's where a kid growing up right now, to them, that would probably, some young people might be like, that makes the most sense.
00:54:00.780 I'm a little older generation.
00:54:02.040 I might be like, oh, that seems a little, but that's always how things are with generation to generation.
00:54:06.260 That's always how it goes.
00:54:06.780 Yeah, you're right.
00:54:07.540 And maybe this is just like a quicker evolution of things.
00:54:10.240 And for young people, it's going to make so much sense.
00:54:12.580 And for older people, it's an energy.
00:54:14.020 You're just going to be like, get off my, you know, avatar lawn or something, you know.
00:54:18.880 But that's the way of societal progress.
00:54:20.760 That's just how it goes.
00:54:21.580 Good point.
00:54:23.340 You know, it's an interesting time for business.
00:54:27.120 Tariff and trade policies are dynamic.
00:54:29.780 Supply chains are squeezed and cash flow is tighter than ever.
00:54:33.480 If your business can't adapt in real time, you're in a world of hurt.
00:54:38.540 You need total visibility from global shipments to tariff impacts to real-time cash flow.
00:54:45.180 That's NetSuite by Oracle, your AI-powered business management suite, trusted by over 42,000 businesses.
00:54:53.840 NetSuite is the number one cloud ERP for many reasons.
00:54:57.320 It brings accounting, financial management, inventory, HR into one suite.
00:55:03.480 You have one source of truth, giving you the visibility and control you need to make quick decisions.
00:55:09.260 NetSuite helps you know what's stuck, what it's costing you, and how to pivot fast.
00:55:14.600 If I needed this product, this is what I would use, NetSuite, by Oracle, one of the most trusted companies in the world.
00:55:21.940 It's one system, full control, tame the chaos with NetSuite.
00:55:27.060 If your revenues are at least in the seven figures, download the free e-book, Navigating Global Trade, three insights for leaders, at NetSuite.com slash T-H-E-O.
00:55:39.300 That's NetSuite, N-E-T-S-U-I-T-E dot com slash Theo.
00:55:45.420 There's definitely been a lot of talk about like tech and governance, right?
00:55:51.400 And I know we touched on it a little bit earlier.
00:55:54.040 And there were people like lobbying and Trump had a big, beautiful bill for like a 10-year ban on state legislation against AI.
00:56:04.360 What do you think about that?
00:56:05.900 Like letting it be this rogue space?
00:56:08.440 There have to be some rules here.
00:56:11.540 There has to be some like guidelines.
00:56:13.220 There has to be some sort of regulation at some point.
00:56:15.240 I think it'd be a mistake to let each state do this kind of crazy patchwork of stuff.
00:56:20.220 I think like one countrywide approach would be much easier for us to be able to innovate and still like have some guardrails.
00:56:28.380 But there have to be some guardrails.
00:56:31.020 Have you met with governments and like government leaders to have discussions like that?
00:56:35.900 Like are they meeting with you?
00:56:36.940 Yeah, they do meet with us.
00:56:38.840 They haven't done anything big yet, but they're talking about it.
00:56:42.280 Do they meet with you to try to keep information out of you guys' data?
00:56:48.580 You know, for all of the paranoia about that, I don't think we've ever had someone come say like,
00:56:52.640 I don't want it to say this negative thing about this politician or this whatever.
00:56:57.800 The concerns are like, what is this going to do to our kids?
00:57:01.180 You know, are they going to stop learning?
00:57:02.360 There's a lot of concerns about that.
00:57:04.640 Is this going to spread fake information?
00:57:06.380 Is this going to influence elections?
00:57:08.280 But we've never had the like, you can't say bad things about the president, Trump or whatever.
00:57:15.080 Bias is a big thing.
00:57:15.940 Like they do want to know like, you know, if it'll say bad things about one candidate, it'll say bad things about the other.
00:57:20.840 Could you guys make it do one or the other?
00:57:22.620 Like, can you guys favor the back end?
00:57:24.760 Or like, we totally could.
00:57:26.020 I mean, we don't, but we totally could.
00:57:27.540 You could.
00:57:27.960 Yeah.
00:57:28.620 Wow.
00:57:29.180 Yeah.
00:57:29.480 I think like, I know you, but how do we know, like, do we give you guys lie detector tests?
00:57:34.220 Are we like, how do we know? Do we, like, test the system?
00:57:36.420 You can, anyone can like test the AI and say, if I say this, it says this, if I say that, but you, you touch on a really big point here, which is like hundreds of millions of people talk to ChatGPT
00:57:46.080 every day and it probably has like a big impact on what they believe.
00:57:53.340 And so I think society's interest in making sure that we are, you know, a responsible, neutral party should be huge.
00:58:01.500 Now people do test it a lot and I think that's good, but like, we got to be held to a very high standard there.
00:58:06.180 But how do we, like, as regular people, or how do, like, regular people just hold you guys to a high standard?
00:58:11.120 Like, is it the, I guess it's politicians' responsibility or, I mean, these guys are idiots.
00:58:15.420 Some of them are like 80 year old dudes giving thumbs up.
00:58:18.060 That one guy couldn't get the wifi on.
00:58:19.500 Remember that guy?
00:58:20.740 That guy couldn't get the wifi on.
00:58:22.660 So I'm like, how do we?
00:58:24.140 I mean, there's a huge amount of people that test our systems all the time looking for any errors, any bias, any, anything.
00:58:30.080 I guess that's a good point is we can tell this.
00:58:32.160 You can tell.
00:58:32.840 Yeah.
00:58:33.160 Right.
00:58:34.180 People can test it on this end.
00:58:38.220 As, as AI grows, like how big do data centers need to be?
00:58:42.120 Is that a concern of you guys?
00:58:43.620 I went recently to one of our new data centers under construction in Abilene, Texas.
00:58:51.460 It's about like an approximately one-gigawatt facility.
00:58:54.240 Huge.
00:58:54.820 You know, it'll be the biggest data center ever built by the time it's done.
00:58:57.720 And you stand in the middle of that.
00:58:59.380 And the scale of this project just hits you so big.
00:59:04.000 That's like one little, that's like one little part of it.
00:59:06.460 Dude, that's like eight Costcos.
00:59:07.420 Because, you know, there's like 5,000 people there doing construction on it.
00:59:11.020 And this thing is just standing up, making progress every day.
00:59:14.100 And you stand in the middle of this thing.
00:59:14.980 And what are you, in a chariot or whatever?
00:59:16.340 Like how do you even roll up?
00:59:17.360 You're like in a little ATV.
00:59:18.660 Oh, okay.
00:59:19.180 It's like a dirty kind of construction site.
00:59:22.960 But the scale of this thing.
00:59:26.040 And then you kind of go in every room and you look at all the cables, the power, the cooling systems, rack after rack of servers.
00:59:33.200 And it's humongous.
00:59:35.320 There's like, they're standing up these like power plants right in the middle of it.
00:59:38.240 Oh, yeah.
00:59:38.980 It's crazy to see.
00:59:40.100 It starts to make our planet look like a software board.
00:59:46.800 It does.
00:59:47.820 You know, when you see it from the air, I was really struck by that.
00:59:51.080 I was like, this looks like the motherboard of a computer.
00:59:53.080 Yeah, it looks like the motherboard of a computer.
00:59:54.220 You start to see like how the planets in like a lot of these like sci-fi movies, a lot of them have that R2-D2 look on the outside.
01:00:01.640 Yeah.
01:00:01.860 Because they've been.
01:00:02.620 Covered in data centers.
01:00:03.660 Yeah.
01:00:04.520 Which is kind of wild.
01:00:05.800 Do you know where we're going and you're not telling us?
01:00:08.000 Do you know what's happening?
01:00:08.520 I don't.
01:00:08.900 I don't.
01:00:09.380 You promise, dude.
01:00:10.320 I don't know.
01:00:10.880 I mean, I have all my guesses.
01:00:11.940 Like I do guess that a lot of the world gets covered in data centers over time.
01:00:15.480 Do you really?
01:00:15.900 But I don't know because maybe we put them in space.
00:59:18.260 Like maybe we build a big Dyson sphere in the solar system and say, hey, it actually makes no sense to put these on Earth.
01:00:23.580 Yeah.
01:00:24.020 I wish I had like more concrete answers for you.
01:00:26.640 But like we're stumbling through this.
01:00:28.320 We maybe, you know, have a little bit higher confidence than the average person, but there's so much we don't know yet.
01:00:34.640 No, that's the craziest thing about you, Sam.
01:00:36.780 And I think this is a compliment somehow, dear God.
01:00:40.240 And yeah, it is a compliment.
01:00:41.380 You're like – it's like you're like come with me through the universe and you're like – people are like, well, what's it like?
01:00:48.500 You're like, I don't know exactly, but – and then we're all – it's like we're all going.
01:00:52.980 It's like I don't know.
01:00:55.000 You're just somehow the most – like you're this – like this charming kind of Terminator, it feels like.
01:01:03.160 And I hate to say Terminator.
01:01:04.100 That's a crazy term.
01:01:05.300 But like – but you're this like – like I'm like, okay.
01:01:08.180 I'm curious.
01:01:09.600 You somehow seem so optimistic about it.
01:01:11.840 It adds to my curiosity.
01:01:14.140 When I was a kid, I assumed that there were always some adults in the room.
01:01:19.820 Someone had a plan.
01:01:21.360 Someone knew everything that was going to happen.
01:01:22.680 Someone had it all figured out, and I sort of think the reason people like conspiracy theories is it's nice to think that someone's got a plan.
01:01:29.000 It's nice to think someone, you know, has it all figured out.
01:01:33.540 And then I got a little bit older and I sort of started to suspect there are no adults in the room.
01:01:39.420 No one – people have plans.
01:01:40.880 I have plans.
01:01:41.840 But no one has all the answers.
01:01:43.280 No one knows where it's all going to go.
01:01:46.080 And now that I am the adult in the room, I can say with certainty, no one knows where it's all going to go.
01:01:50.080 Like I'm the guy in the room and I have some guesses and I have some plans and we're working really hard.
01:01:57.080 But like, you know, we try to always say what we think the possibilities are, what we think is most likely.
01:02:04.460 Often we're right.
01:02:06.140 Sometimes it's in the broader set.
01:02:07.300 And sometimes it goes in a totally different direction than anything we thought.
01:02:10.160 And, you know, we keep trying to make progress, figure out more.
01:02:15.520 We try to tell people – not just tell, we try to show people by like deploying these systems and say, hey, you can go use it.
01:02:20.940 Don't just take our word for it.
01:02:22.000 Try it out.
01:02:22.460 See what it can do.
01:02:23.360 Yeah.
01:02:23.460 But like I can say with conviction, the world needs a lot more processing power.
01:02:29.960 But does that look like tiling data centers on Earth, which I think is what it looks like in the short term, and maybe in the long term also, or do we go build them in space?
01:02:37.140 I don't know.
01:02:37.560 It sounds cool to try to build them in space but also really hard.
01:02:40.800 What about like the environmental effects of those and stuff?
01:02:43.140 Like there's been like – you know, there's been articles written and I don't know how much of it is real or not real, right?
01:02:48.480 Because who knows what to believe.
01:02:50.340 But you'd have to think that, you know, it takes water to cool them, right?
01:02:53.720 It takes power to power them.
01:02:55.360 Yeah.
01:02:55.480 You know, there's some in like Arizona and Iowa that there's been like repercussions within the environments there in the communities.
01:03:03.140 And a lot of those companies don't have to report those things because it's considered proprietary, you know?
01:03:10.000 What do you think about those fears?
01:03:12.720 Or how do you guys manage that?
01:03:13.980 Like do you guys talk about that?
01:03:14.980 Do you meet with environmentalists?
01:03:16.320 Like what does that all look like?
01:03:17.340 I think we need to get to fusion as fast as possible.
01:03:19.640 Get to what?
01:03:20.160 Nuclear fusion.
01:03:21.580 I think that is the –
01:03:23.020 Oh, shit.
01:03:23.660 What is it?
01:03:24.500 Where you basically knock two small atoms together and it makes a bunch of energy but no carbon, very clean, doesn't generate – you know, doesn't really harm the environment.
01:03:33.040 And power can become like abundant and pretty limitless on Earth.
01:03:37.820 And we get out of all the current problems we're in.
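[Editor's note: for context on what "knocking two small atoms together" means, the reaction most often cited for fusion power is deuterium-tritium fusion; the episode doesn't name a specific fuel cycle, so this is just the textbook example.]

```latex
% Deuterium-tritium fusion: two hydrogen isotopes fuse into helium,
% releasing an energetic neutron and about 17.6 MeV per reaction, with no carbon.
{}^{2}_{1}\mathrm{D} + {}^{3}_{1}\mathrm{T} \;\rightarrow\; {}^{4}_{2}\mathrm{He}\,(3.5\,\mathrm{MeV}) + n\,(14.1\,\mathrm{MeV})
```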
01:03:40.260 Are you guys investing in that?
01:03:41.400 We are.
01:03:41.740 And I think AI can help us figure it out even faster.
01:03:43.940 So that's like a – you know, if you have to like burn a little bit more gas in the short term but you figure out, you know, the future of energy with that AI, it's a huge win.
01:03:52.640 And would you guys sell tickets to that or what do you think that would be like?
01:03:56.520 Yeah, I think we would just sell –
01:03:57.300 Because people are going to watch that shit.
01:03:58.340 I mean, yeah.
01:03:59.620 People go to monster trucks.
01:04:00.720 You don't think they'll roll up to watch those two things hit each other?
01:04:03.400 The atoms hit each other?
01:04:04.260 Yeah.
01:04:04.560 It's pretty hard to watch two atoms hit each other.
01:04:05.900 But maybe with the – you know, somehow we can do it.
01:04:07.800 Or what if they did like those sperm races where they would put them out of those big things or whatever?
01:04:10.940 I love those sperm races.
01:04:11.700 Wow, they're kind of crazy.
01:04:13.940 I'm like, dude, there's enough of that going on.
01:04:16.280 Look, I think the – yeah, there will be some way to watch fusion and it will be awesome and it will be like loud and bright and theatrical and it will be making huge amounts of energy.
01:04:26.660 Even if you can't watch the two atoms hit, you'll watch them collectively produce fireworks.
01:04:32.900 But we're going to need that?
01:04:34.060 Do you think if we're going to get to –
01:04:35.380 I think so.
01:04:35.840 If we're going to get to AGI or if we're going to get to superintelligence, do we need that?
01:04:40.620 I bet we can get there without it.
01:04:42.580 But to provide it at the scale that humanity will demand it, I think we do need it.
01:04:47.340 Because people – the desire to use this stuff, people are just going to want more and more and more.
01:04:53.120 And eventually, like the two things that I think matter most, the two kind of critical inputs are intelligence and energy.
01:04:59.460 The ability to like have great ideas, come up with plans, and then energy is the ability to like make them happen in the world and also to run the intelligence.
01:05:07.020 And I think the story of the next couple of decades is going to be the demand for these goes up and up and up to crazy heights and we better find out how to produce a lot.
01:05:16.900 Otherwise, someone is going to feel like they're getting screwed.
01:05:19.840 Yeah.
01:05:21.720 Dang, dude.
01:05:22.300 I can't tell if I'm excited or scared.
01:05:24.220 Maybe I'm both and maybe it's all the same thing.
01:05:26.140 You have to be both.
01:05:26.840 You have to be both.
01:05:27.700 I don't know if it's the same thing or not.
01:05:28.680 I think it is kind of like they do feel related to me always.
01:05:33.920 But I don't think anyone could honestly look at the trajectory humanity is on and not feel both excited and scared.
01:05:42.160 Yeah.
01:05:43.600 And maybe that's always been the way throughout time.
01:05:45.700 And also then this is where we are.
01:05:47.380 What are you going to do?
01:05:49.240 You know?
01:05:50.020 Like this is where we are.
01:05:51.780 And so that's what's going on.
01:05:53.360 I saw where you and Joe Rogan spoke about there possibly being one day like an AI president, you know, where like what if you had this one kind of – let's just use the term supercomputer or this agent that was created that knew all the information and knew all of the problems and knew the best ways to solve them.
01:06:12.500 Is that – do you think that something like that is becoming more and more possible one day?
01:06:19.580 I don't know everything that it takes to be a president.
01:06:23.780 But I do know it like takes a lot of things that I don't have to do and that people are going to – well, maybe I could reframe it to an AI CEO of OpenAI because I do know what that job is like.
01:06:33.920 Okay.
01:06:34.080 That should be possible someday.
01:06:36.040 Maybe not even that far.
01:06:37.740 Like I think the idea to look at an organization, to make really good decisions – there's a lot of things you can imagine that an AI CEO of OpenAI could do that I can't.
01:06:45.860 I can't talk to every person at OpenAI every day.
01:06:48.220 I can't talk to every user of ChatGPT every day.
01:06:50.900 I cannot synthesize all that information even if I could talk to them all.
01:06:54.000 But an AI CEO could do that.
01:06:56.100 And it would have better information, more context.
01:06:58.100 It could, you know, massively parallelize this.
01:07:00.320 And I think that would lead to better decisions in many cases.
01:07:02.300 Yeah, because wouldn't a supercomputer or something that has all knowledge, which you think will get there?
01:07:08.980 I do.
01:07:09.480 You do.
01:07:10.180 I mean, all knowledge is a hard thing to say.
01:07:12.920 I think it will have vast, vast amounts.
01:07:15.820 Will it be able to tell us about God or anything, do you think?
01:07:20.360 I'm super curious about that.
01:07:23.660 I think it will be able to help us answer questions about the nature of the universe that we currently can't.
01:07:28.800 And I feel very confused and very unsatisfied with our current answers.
01:07:33.440 And there is clearly, to me at least, something going on well beyond our current capability to understand.
01:07:39.440 And I would love to know what that is.
01:07:42.360 Do you think it could help us learn more?
01:07:44.500 Yes.
01:07:47.740 I wonder if God has a ChatGPT or whatever.
01:07:53.180 Or I just wonder if he has the first one or whatever.
01:07:55.640 But, yeah, I'm just so curious, like, how would that work?
01:07:59.640 How does OpenAI make money?
01:08:02.760 We sell ChatGPT.
01:08:03.960 You pay $20 a month.
01:08:04.780 Some people pay $200, but very few, or relatively few.
01:08:07.740 Perverts, I think they are.
01:08:10.020 Mostly, hopefully, they're just working super hard and using it for being more productive at their job.
01:08:14.400 And then we also sell an API.
01:08:16.340 So businesses can use it, and they, like, pay us every time they make an API call.
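[Editor's illustration: a minimal sketch of what that per-call billing looks like from the developer side, using the OpenAI Python client. The model name and prompt are placeholders, and actual pricing depends on the model, but the metering is based on the token counts each response reports.]

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name for the example
    messages=[{"role": "user", "content": "Summarize this support ticket in one sentence: ..."}],
)

# Every API call reports how many tokens it consumed; that is what gets billed.
usage = response.usage
print(f"prompt tokens: {usage.prompt_tokens}, "
      f"completion tokens: {usage.completion_tokens}, "
      f"total: {usage.total_tokens}")
```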
01:08:19.860 Okay.
01:08:22.360 Do you think, like, there's a lot of these, like, kind of tech lords that are rocking right now, right?
01:08:28.160 And you get thrown in there.
01:08:29.860 Sometimes.
01:08:30.560 I'm, like, on the periphery.
01:08:31.940 Yeah.
01:08:32.240 Or you get certainly, like, yeah, like these council, these councilmen kind of, like, do you think there's bad actors amongst, like, these tech lords in these AI realms?
01:08:42.420 Do you think there's bad actors out there?
01:08:43.660 What does bad actors mean?
01:08:44.420 Just, like, people that want for evil and not for good?
01:08:50.100 I think most people don't wake up.
01:08:53.980 I think very few people wake up every morning saying, I'm going to try to make the world a worse place.
01:08:59.520 Or I'm going to actively try to do evil.
01:09:01.460 Clearly, some do.
01:09:02.220 But I think most of these people running the big tech efforts are not in that category.
01:09:06.040 I think people get blinded by ambition.
01:09:08.180 I think people get blinded by competition.
01:09:09.720 I think people get caught up, like, very well-meaning people can get caught up in very negative incentives, negative for society as a whole.
01:09:20.500 And, by the way, I include us in this.
01:09:23.060 Like, we can totally get caught up and we can be very well-meaning but get caught up in some incentive and it can lead to a bad outcome.
01:09:30.880 So that's kind of what I would say.
01:09:32.420 I think people come in with good intentions.
01:09:34.120 They clearly sometimes do bad stuff.
01:09:35.520 There's a lot of talk about, like, Palantir and Peter Thiel and their company about being like a – you know, they got a deal from Trump to have this surveillance – or not a surveillance state, but to create a database on most of America.
01:09:53.080 But it starts to feel like a surveillance state, you know?
01:09:56.160 Do you feel like we will need something like that in order for the future?
01:10:04.720 You know, do you feel like something like that is included in the future?
01:10:08.160 So I don't know about that specifically.
01:10:10.100 I mean, I think Palantir and Peter do a lot of great stuff.
01:10:13.980 But, again, I can't comment on this specifically.
01:10:17.140 I'll say generally, I am worried that the more AI in the world we have, the more surveillance the world is going to want because the tool is so powerful.
01:10:27.640 The government will say, like, how do we know people aren't using it to make bombs or bioweapons or whatever?
01:10:32.240 And the answer will be more surveillance.
01:10:34.860 And I'm very afraid of that.
01:10:36.700 So I don't – I think we really have to defend rights to privacy.
01:10:44.000 I don't think those are absolute.
01:10:45.040 I'm, like, totally willing to compromise some privacy for collective safety.
01:10:49.600 But history is that the government takes that way too far.
01:10:52.740 And I'm really nervous about that.
01:10:54.700 Do you guys feel like the new government, kind of, or do you feel like the government is still like a real thing?
01:10:59.940 I don't feel like the government, anyway.
01:11:01.580 You don't?
01:11:02.500 When the U.S. government bombed Iran recently, I remember waking up that morning and seeing that news or whatever time it was.
01:11:10.920 And I was, like, oh, that's what actual power looks like.
01:11:18.380 You know, we're in, like, a – maybe someday we get there.
01:11:21.340 But it was, like, a really stark reminder of however important we think this is.
01:11:24.060 It's, like, there are people that have just, like, this unimaginable power and might and can kind of do whatever they want.
01:11:30.440 And that's definitely not us.
01:11:31.700 Yeah.
01:11:32.620 Yeah, I think that's been a lot in the Middle East recently is just, like – it's just such gross displays over there sometimes of inhumanity.
01:11:41.420 Absolutely.
01:11:41.900 It's sad.
01:11:43.700 What do you think a guy like then, like, Palantir or Peter Thiel's endgame is?
01:11:47.180 Do you think he has an endgame?
01:11:48.620 Because I think he seems like a dark lord to a lot of us.
01:11:51.160 And it's, like, does he – you think he has an endgame that is, like, happy?
01:11:56.300 I think Peter is one of the most brilliant people I've ever met.
01:11:59.420 Oh, he's smarter than me.
01:12:00.500 That's for sure.
01:12:01.400 I think he does get caricatured in the media as this, like, evil mastermind.
01:12:07.160 As a villain, he does.
01:12:08.200 I never met him.
01:12:09.020 I met him.
01:12:09.600 We're very close friends.
01:12:11.600 I –
01:12:12.080 I should have brought it up then.
01:12:13.500 No, it's all good.
01:12:14.160 No, no, no, no.
01:12:14.720 It's all good.
01:12:15.780 I don't feel that energy from him, but I – at all, like, I – in fact, I think he's been one of the most important forces,
01:12:27.240 at least in my life, for questioning assumptions about the path that society was on.
01:12:32.980 And maybe I was like, oh, I thought this was all going well, but maybe we are in a tech stagnation.
01:12:37.420 Maybe we really do have this huge economic challenge that no one's talking about.
01:12:41.160 And so I think these people who are just very – that think very differently, he would call it very contrarian, are super important to a society.
01:12:54.800 Now, on the other hand, you know, maybe he – maybe he sometimes does things like this that don't do him any favors.
01:13:08.480 What is it?
01:13:09.380 You would prefer the human race to endure, right?
01:13:13.360 You're hesitating.
01:13:14.660 Well, I –
01:13:15.260 Yes?
01:13:15.700 I don't know.
01:13:16.120 I would – I would –
01:13:19.800 This is a long hesitation.
01:13:22.140 This is a long hesitation.
01:13:23.160 There's so many questions implicit in this.
01:13:24.900 Should the human race survive?
01:13:29.220 Yes.
01:13:29.800 Okay.
01:13:30.120 But –
01:13:31.260 God.
01:13:32.280 I mean, that was 22 seconds it took him.
01:13:34.000 Yeah.
01:13:34.700 So if he were maybe like a more typical person, he would have just said an immediate yes.
01:13:41.900 Right.
01:13:42.200 And then said what else he wanted to say.
01:13:44.240 And it took me a while with him to understand that his brain just works differently.
01:13:48.380 And society needs some of that.
01:13:50.620 Like he has these super different takes and then he doesn't have maybe the circuit in his brain that makes him immediately say yes and then say what he was going to say.
01:13:58.820 But, you know –
01:13:59.120 Maybe his processors.
01:14:00.140 Yeah.
01:14:00.400 I'm very grateful he exists because he thinks of things no one else does.
01:14:03.980 Yeah.
01:14:04.220 You know – yeah.
01:14:05.960 You want – they're – novel thinkers have changed things throughout time, sometimes for the better and sometimes for the worse, sometimes for the indifferent.
01:14:14.100 But novel thinkers have – you've always like – I don't know.
01:14:19.300 It's always been part of humanity.
01:14:21.260 I'm probably super different and super weird relative to most people.
01:14:24.800 But, you know, maybe I have some ideas as part of that that are like valuable to society collectively.
01:14:29.360 And if I had this sort of very standard mindset, I wouldn't.
01:14:32.160 That's a good point.
01:14:33.360 Yeah.
01:14:33.600 Well, do you think – and I'm just going to ask you real honestly.
01:14:36.240 Do you think a lot of these guys have – I mean, you know, it's not like, you know, love on the spectrum is like a big show, right?
01:14:44.340 People – you know, it's like – and those people are in love and shit.
01:14:47.140 Half the people I know are just, you know, barely – you know, they're crying in parking lots or whatever.
01:14:52.000 But, you know, their spousal issues or whatever.
01:14:54.580 But anyway, what I'm saying is do you think that some of the creators now and some of the tech lords are – almost have some tech built into them?
01:15:03.840 Like almost a – I don't want to say like an autism dude because –
01:15:07.100 You can say that.
01:15:07.720 Okay.
01:15:08.140 I think so.
01:15:09.080 I mean, yeah.
01:15:11.060 I – you know, to take the kind of like harshest look at us collectively that I can.
01:15:15.500 You know, are we a little autistic on the whole?
01:15:17.500 I would say probably.
01:15:18.740 I knew that shit.
01:15:19.740 That's all right.
01:15:20.420 No, no.
01:15:20.960 That's what I'm saying.
01:15:21.400 And for years ago, I was – first time I ever met some people with autism, I was like, dude, these guys are computers, right?
01:15:27.560 Like a lot of these guys are just – you know, they're some – they're kind of like a little bit of a cyborg in some way in the way that they think, right?
01:15:34.720 You know, look, I'm – you are this like impossibly charming cool guy and I'm like kind of a lot more computery than you.
01:15:40.460 You're not much though.
01:15:41.280 We can have it.
01:15:41.860 We can still like figure it out.
01:15:43.860 Oh, yeah.
01:15:44.520 And I really don't mean it as an offense, but I think that we may need that in people to get whatever's next in the world.
01:15:50.460 Do you think that's realistic?
01:15:52.100 Yeah.
01:15:52.400 I think society needs like this very broad diversity of people.
01:15:56.840 You need some people like me.
01:15:57.840 You need some people who are more normal than me.
01:15:59.560 You don't want too many of me, but like –
01:16:01.460 Yeah.
01:16:02.260 Yeah, you don't want too many of any one thing.
01:16:03.940 Yeah.
01:16:04.100 Yeah.
01:16:05.020 Yeah, I'm just always – I'm like, God, yeah, these people are able to see things differently and quantify things differently.
01:16:10.140 Do you always feel – because some tech guys, they just have a different understanding of possibility, right?
01:16:17.440 A different understanding of feeling and thing.
01:16:19.740 Do you feel human all the time?
01:16:22.040 I do feel human all the time, but I feel like I have noticed that I think extremely differently about the future, about exponential change, about compounding technology than almost anybody else that I kind of come across in regular life.
01:16:36.240 So –
01:16:36.740 That's cool.
01:16:37.940 I feel extremely human.
01:16:39.140 And I feel like, you know, driven by crazy emotions as much as anybody.
01:16:42.640 But I am like very aware that I have a different lens than a lot of people.
01:16:47.680 Have you met some people in the tech space and you're like, whoa, that guy is only like 6% or 7%.
01:16:51.200 He's low.
01:16:52.040 Not a lot of human in him.
01:16:53.400 Yes.
01:16:55.100 Yeah.
01:16:56.400 Okay.
01:16:59.680 Do you think it's inevitable that AI or AGI will merge into our bodies?
01:17:03.900 I know you've talked about this before in the past.
01:17:06.220 As things go along and advance quickly, do you start to see that a little bit differently?
01:17:10.000 I know you've talked about how you don't think it's like a glasses thing or something like that.
01:17:13.300 I'll tell you a fascinating story.
01:17:14.900 Okay.
01:17:15.360 I was with a friend last week.
01:17:17.240 And did I offend you by asking that?
01:17:18.360 Not at all.
01:17:18.840 Okay.
01:17:19.100 Zero percent.
01:17:19.600 I thought that was a great answer.
01:17:20.720 I really appreciate it.
01:17:21.880 Because, yes, some of us are – we can't conceptualize sometimes how you guys are thinking.
01:17:25.740 It can't – I can't even like – we feel like we can't figure it out, you know?
01:17:29.540 So it feels like it's almost like a unique – it's like are we all evolving into this new kind of species and that's where we meet the future at anyway?
01:17:37.920 And you're just like the dang Paul Revere out there, you know?
01:17:40.040 It's like –
01:17:40.460 For better or for worse, I think whenever you see someone who thinks differently than you,
01:17:44.700 it's like – like I'm fascinated by you.
01:17:47.060 I don't quite understand how you do your thing.
01:17:48.840 I know I couldn't do it.
01:17:49.720 I know you like just understand the world differently than me.
01:17:52.820 But I think that's cool.
01:17:54.220 And I'm just like, all right, I'm glad.
01:17:55.340 Yeah, that's how I feel.
01:17:56.260 Yeah.
01:17:56.680 I think it's just – thanks for just talking to me about it because sometimes I think I get afraid to say –
01:18:00.780 I don't think you should be afraid.
01:18:02.240 I don't think anybody would be offended by that.
01:18:04.240 I was talking to this friend of mine though about how he uses ChatGPT and he's been using it a lot for a couple years now.
01:18:11.840 And he noticed recently that he started giving it personality tests.
01:18:15.680 He had uploaded any personality test he could find to ChatGPT and say, based on what you know about me, answer this.
01:18:22.320 And he had never like told it, here's my personality.
01:18:25.920 It had just learned it from the questions he asked over the years.
01:18:28.560 And on every one he tried, it got exactly the answer and exactly the outcome he would get.
01:18:35.380 And so that's not like – he didn't get uploaded.
01:18:38.080 He didn't get merged.
01:18:38.960 He didn't plug something into his brain.
01:18:40.820 But somehow like the pattern of him had gotten imprinted into this AI.
01:18:46.040 Wow.
01:18:47.280 Maybe we're not as complex as we think we are.
01:18:50.260 Or maybe we are and AI can just learn it really well.
01:18:54.460 AI can like represent these very complex things.
01:18:56.700 One of those two.
01:18:57.380 But that was a real moment for me of like, wow, the merge maybe can happen in a very different way than we thought.
01:19:04.100 Yeah.
01:19:05.040 Yeah, because you think of it as this thing kind of taking over your system and like your dad presses a button and you can't use the car.
01:19:11.340 You can't move for a month or whatever.
01:19:13.620 Yeah, I think it kind of has that sort of energy.
01:19:15.880 You just finished the acquisition of – this is a little bit more like day-to-day business.
01:19:23.640 You just finished the acquisition of Jony Ive's hardware company, their hardware company.
01:19:31.980 Yeah.
01:19:32.100 So clearly you have some like thoughts or interest in how like hardware and AI match up for each other in humanity.
01:19:40.960 What was that about?
01:19:42.880 There have been two revolutions in computers in history.
01:19:45.460 There was the keyboard, mouse, and screen, that thing that was invented down the street in I think the 70s where, you know, the people at Xerox PARC figured out what has become the modern computer interface.
01:19:55.860 And then in the early 2000s, I guess, Apple figured out this idea of touch on a device.
01:20:03.980 And really, those have been the two big ones.
01:20:07.300 I think now there can be a third.
01:20:09.020 I think AI is – it so changes the game that you can design a new kind of computer based off of a really smart AI where you can give a complex instruction to a system.
01:20:20.320 It can go do it.
01:20:20.900 You'll trust that it gets it right.
01:20:22.100 You'll trust it to act on your behalf.
01:20:23.560 It could like maybe be aware of everything going on in this room and it could like kind of not just be on or off but like lightly get our attention if it wants us to know something or maybe more aggressively get our attention.
01:20:33.020 It could really be like following what we're talking about here and remind us both of things later.
01:20:38.420 And current hardware just can't do that.
01:20:41.140 The current kind of computers we have, I don't think are a fair – they don't honor what the technology is now really capable of.
01:20:49.140 So I want to make a totally new kind of computer that is meant for this world of AI.
01:20:53.560 It's helping you all the time.
01:20:55.100 I'm super excited about it.
01:20:56.220 You are?
01:20:56.800 Yeah.
01:21:00.460 You guys – there's this thing called agent that you guys just showed me earlier.
01:21:03.800 I can take this out if I'm not supposed to mention it.
01:21:05.020 No, no.
01:21:05.080 It's all good.
01:21:05.320 I was supposed to.
01:21:05.980 It was pretty fascinating.
01:21:07.740 It was cool to see.
01:21:08.540 It is.
01:21:09.140 Yeah.
01:21:09.440 This is a new thing that we just did.
01:21:11.020 But the idea that an AI can not just answer questions for you but can go actually do stuff on your behalf as your agent.
01:21:19.620 It can go do research for you.
01:21:21.460 It can go book something for you.
01:21:23.380 It can go buy something for you.
01:21:24.320 It can go like change some things in the world for you and think more and use tools.
01:21:28.260 Like I think most people think of ChatGPT as this app that you can ask anything but it will become this thing that can do anything and that will change how you use computers.
01:21:39.380 It will change how you do things in your life.
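[Editor's illustration: the agent behavior Sam describes is built on the model being able to call tools your code exposes and then act on the results. Here is a minimal sketch of that tool-calling pattern with the OpenAI Python client; book_table is a hypothetical function invented for the example, the model name is a placeholder, and this is not the actual ChatGPT agent product.]

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# Hypothetical tool the model is allowed to call; your own code does the booking.
tools = [{
    "type": "function",
    "function": {
        "name": "book_table",
        "description": "Book a restaurant table for the user.",
        "parameters": {
            "type": "object",
            "properties": {
                "restaurant": {"type": "string"},
                "time": {"type": "string", "description": "e.g. '7:00 PM tomorrow'"},
                "party_size": {"type": "integer"},
            },
            "required": ["restaurant", "time", "party_size"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name for the example
    messages=[{"role": "user",
               "content": "See if my usual pizza place has a table for two at 7 PM tomorrow and book it."}],
    tools=tools,
)

# Instead of plain text, the model can return a structured tool call;
# an agent loop executes it and feeds the result back for the next step.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```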
01:21:41.660 Yeah.
01:21:42.060 I was watching the guy do it and it was just kind of fascinating.
01:21:45.060 He was showing like one time he went to like a website and bought something that he needed and then now moving forward he could just be like, hey, go to this and make sure to get me these or go to – go here and see – go to the restaurants I like and see if there's any table available for 7 p.m. tomorrow.
01:21:59.620 And it was able to book it and do everything.
01:22:01.820 It was like having a secretary right there.
01:22:03.660 Totally.
01:22:03.980 When I first started using it, I was like – it was one of those moments where I could tell that, oh, man, doing this the old-fashioned way is going to feel like the Stone Age so quickly.
01:22:13.920 You know, I'm going to like try to tell people someday like, do you remember when if we wanted to do something, we actually had to go like click around the internet and like, you know, look for a table and then if we wanted to move it, we had to like call the restaurant and that's going to be unimaginable because of course you just tell your AI to do those things for you.
01:22:30.860 Yeah.
01:22:31.220 Yeah, you feel like you would almost just tell it to go eat too, you know.
01:22:35.360 That's the fun part.
01:22:36.340 Yeah.
01:22:36.600 Oh, yeah.
01:22:36.920 No one likes booking the table.
01:22:37.920 Everyone loves sitting there eating.
01:22:39.080 That's a good point, huh?
01:22:40.120 Yeah, yeah.
01:22:40.540 It won't take away the fun part.
01:22:41.840 That's the thing.
01:22:43.020 I think you got to remember that.
01:22:44.120 It won't take away the fun part.
01:22:45.160 You're going to do the things you want to do.
01:22:46.400 There's a lot of things in your life you probably don't love doing, like booking an open table is maybe one of them.
01:22:50.680 Yeah.
01:22:51.000 And then you'll have like old-fashioned people be like, oh, I'll book it, you know.
01:22:54.020 You're like, Dad, what do you mean?
01:22:55.200 Get off the phone or whatever.
01:22:56.080 Don't call him, you freaking weirdo.
01:22:58.080 Use a freaking, use your agent.
01:22:59.660 Totally.
01:23:00.380 Like, oh, I'll book it.
01:23:03.260 There's like a lot of like, you know, Zuckerberg recently like kind of was poaching guys around town, right?
01:23:09.800 And I'll say it.
01:23:10.440 You don't have to say it.
01:23:11.420 Allegedly.
01:23:11.880 I'm not saying he did.
01:23:13.080 He hired one of my buddies.
01:23:14.460 But what I'm saying is there's this hypothetical that he was like kind of poaching guys around town.
01:23:19.620 Yeah.
01:23:19.740 Is that, did that feel like a mafioso move in the community?
01:23:24.080 What was that like out here on, out here in the tech trenches?
01:23:27.140 I mean, you know, they want to get into the AI game.
01:23:32.740 I understand it.
01:23:34.640 So, and if he's going to do this, he needs to hire some people.
01:23:37.820 So bring it.
01:23:38.280 So bring it.
01:23:38.900 So bring it.
01:23:39.440 Yeah.
01:23:41.060 Fuck yeah, dude.
01:23:42.300 I'm going to upload myself into this plant in a second.
01:23:45.640 Okay.
01:23:46.100 No.
01:23:46.740 But no, does it, do you kind of like the competition?
01:23:48.760 Is that fun?
01:23:49.680 It is.
01:23:50.120 It's like winning is fun.
01:23:51.920 Yeah.
01:23:52.140 And I expect to win.
01:23:53.740 And you got to love the comment.
01:23:54.900 That's part of it, right?
01:23:55.640 It makes it fun.
01:23:57.740 I think about what it would be like if we didn't have competition and drama in the world.
01:24:03.620 It'd be so boring.
01:24:05.480 Could, uh.
01:24:06.420 Actually, can I say one more thing about that?
01:24:07.740 Sure.
01:24:08.040 The best improvement I made in my life, in my, like personally in my life and for my own happiness over the last couple of years, a lot of bad shit has happened to us.
01:24:15.440 To me, it's been like a crazy intense experience.
01:24:18.440 And I just decided that I was going to like learn to love the hard parts.
01:24:22.960 I was like, you know what?
01:24:24.080 If I'm in this crazy moment, if I'm in this like crazy thing, if I like feel my emotions are high, I'm going to like make myself learn to be grateful for that, to love it, to find enjoyment in the, in the tension, in the competition, whatever.
01:24:35.580 And actually it worked.
01:24:37.280 And it, it kind of needed to work because like so many things go wrong in any given day.
01:24:42.040 But I was like thinking about, you know, someday I'll be like retired on my ranch.
01:24:45.700 I'll be sitting there watching the plants grow and I'll be missing the excitement and the drama and the anger and the tension and the whatever.
01:24:52.120 And so I'm going to be like grateful for it and like learn to have fun with it.
01:24:55.500 And now it, like, I cannot believe that that mind shift, mindset shift worked, but it did.
01:25:02.660 And were there practices like in a moment, like say like a moment came up, like some of the early ones, right?
01:25:07.020 Because I agree with you that like having some mindset, like I used to hate traveling, like every week traveling for work.
01:25:12.260 But then one day I was like, dude, you have to travel for work.
01:25:15.560 Deal with it.
01:25:16.060 You may as well.
01:25:16.740 Right.
01:25:17.040 You may as well.
01:25:17.980 Because for years you've been, and right there, suddenly it wasn't bad anymore.
01:25:22.180 That happened for me too.
01:25:23.280 Was there like a, just a practice or was it just this verbal reminder?
01:25:26.160 Like I'm going to do this.
01:25:26.880 I just kept saying to myself, I was just like, someday you'll miss these moments.
01:25:31.380 You may as well find a way to like find the happiness and kind of great gratitude for them in the moment.
01:25:38.220 Yeah.
01:25:39.840 Um, a lot of these guys have bunkers.
01:25:42.300 Zucky has a bunkie.
01:25:43.260 I know that somewhere out in Hawaii, people have bunkers.
01:25:47.560 Do you have a bunker?
01:25:48.740 I have like underground concrete, heavy reinforced basements, but I don't have anything I would call a bunker.
01:25:54.860 Hold on, hold on, hold on, hold on, dude.
01:25:57.340 Look, I'll let you, I'll let you keep me on the ropes in a lot of this conversation, but I am going to call that out as a dang bunker, dude.
01:26:03.340 Sam, that's a bunker.
01:26:05.380 What's the difference between a basement and a bunker?
01:26:07.320 A place you could hide when it all goes off or whatever.
01:26:10.460 I know.
01:26:10.960 Yeah.
01:26:11.140 I have been thinking I should really do a good version of one of those, but I don't, I don't have like a, I don't have what I would call a bunker, but it has been on my mind.
01:26:18.740 Not because of AI, but just because of like people are dropping bombs in the world again.
01:26:23.560 And then, you know, like.
01:26:24.980 That's a good point.
01:26:26.200 That's a very good point.
01:26:27.220 Yeah.
01:26:27.440 Basement, right there.
01:26:28.060 Part of a house, typically used for storage, laundry, extra living space or utilities.
01:26:33.520 And then bunker built for protection, often military or emergency related, meant to withstand explosions.
01:26:40.440 We don't have that yet.
01:26:41.300 Do you guys do this just for me or do you use ChatGPT as the fact check?
01:26:43.920 Like, we did this just for you.
01:26:45.860 I appreciate it.
01:26:47.260 This is nice.
01:26:48.300 If, could we, could we ever have instead of, so you start to see, say if AI comes over and there's this whole new kind of like, you know, I believe that one of the things that's been happening, there's been like a lot of like ICE raids and people getting like taken out of their homes.
01:27:00.800 And, you know, there's been a lot of crackdown because part of me believes that they're having to get everybody documented or online basically because they're going to start to have this like this like facial recognition everywhere.
01:27:19.280 Like I have this idea of that.
01:27:20.580 So, yes, this stuff had to happen because in a year or year and a half, you wouldn't even be able to be outdoors anywhere, anywhere without a drone or something noticing you or some camera noticing that you're not supposed to be there or you're not there with documentation, right?
01:27:34.160 Whatever people's thoughts are on that.
01:27:36.120 But just so part of me starts to see like, oh, okay, that's going on.
01:27:40.020 Do you think we could ever then down the line have new countries like delineated by like almost like a new AI landscape?
01:27:48.580 Like remember when on Snapchat, if you were in a certain realm, you could put like a filter on something and they almost created these new like geo barriers and stuff.
01:27:57.820 Do you think we could potentially be looking at something like that one day?
01:28:01.820 I know that what you just said is going to happen.
01:28:04.200 I know that we're going to have like cameras on, you know, all over the place and it's going to make the cities way safer because everybody like if you commit a crime, they'll have like a facial recognition hit on you right away.
01:28:15.320 But man, do I find that dystopic.
01:28:17.260 Like, like you do, of course, like, you know, is it like a good trade if it means like people stop getting murdered in the streets?
01:28:25.380 Yeah, sure.
01:28:26.040 We agree to like give up some privacy for that.
01:28:28.280 But it it sits so uncomfortably with me.
01:28:31.800 You know, in like London or whatever, you see those cameras on every street corner.
01:28:34.980 And you're just like, you get used to it fast.
01:28:38.800 Yeah.
01:28:39.060 But you're just like, oh, it feels like privacy is important.
01:28:43.720 And like you, you really are like, there's nothing I can do to live in the world and avoid all these cameras.
01:28:50.500 And maybe it's worth it for society collectively.
01:28:53.000 But it...
01:28:53.960 It feels like we really do give up a lot to get it.
01:29:01.280 But could there one day you think if we had that, then we could have whole new countries kind of that were.
01:29:06.380 What do you mean by new countries in this case?
01:29:08.040 Like, say, if there was this new kind of layer, right, a surveillance layer that's kind of in the air, then could that be divided into different realms?
01:29:22.980 Oh, yes, totally.
01:29:23.960 That can I think there's all kinds of weird ways that that can happen.
01:29:27.800 But.
01:29:29.280 But the surveillance layer is so uncomfortable.
01:29:31.420 Oh, yeah.
01:29:31.940 It's going to be a nasty blanket.
01:29:33.300 Is there anything else that you wanted to talk about?
01:29:36.680 You wanted to get out that you want me to ask you about?
01:29:38.940 No, that was great.
01:29:40.760 Oh, why are there?
01:29:42.580 Why does ChatGPT have that hyphen thing?
01:29:44.980 We got to do something about that.
01:29:48.520 You know, we have this team that figures out what the model's personality should be like and how it should behave.
01:29:55.120 And a lot of users like M dashes.
01:29:57.380 So we had more M dashes.
01:29:59.040 And now I think we have too many M dashes.
01:30:00.580 But that's the answer: it was just like users liked it.
01:30:03.460 We put more in.
01:30:04.640 Now it's like a little bit of a meme.
01:30:06.020 And it's kind of it's quite annoying to me.
01:30:07.480 We should we should fix that.
01:30:08.640 But you're thinking about it, too.
01:30:09.580 I think we'll get it fixed.
01:30:10.380 Okay.
01:30:12.760 Before you go, Sam, and thank you so much for your time today.
01:30:14.820 It's been awesome.
01:30:15.260 We appreciate it, man.
01:30:17.120 It's helped me get to understand you.
01:30:18.540 I feel like a lot.
01:30:19.560 I think maybe differently than I.
01:30:21.860 I don't know if I had a perception.
01:30:23.060 I didn't know what to think.
01:30:23.700 What's the before and after?
01:30:24.540 The before was like a little bit like I guess I almost thought kind of like not as hopeful.
01:30:36.900 But I don't know why.
01:30:38.100 Maybe that's just my own.
01:30:39.140 I think it's me attaching my own perceptions of what I think about AI and stuff, or the possibilities of technology, you know, like that kind of stuff, like that curmudgeonly energy.
01:30:47.460 I think I was probably attaching it to you.
01:30:49.260 And now I feel like I'm more whimsical about it, kind of like or not whimsical, but like let's see what can happen.
01:30:58.180 Right.
01:30:58.740 And so I think it's not just let's see.
01:31:00.160 It's like let's try to make it good.
01:31:01.600 But let's realize that you have to like you don't get to see all the way down the road.
01:31:05.360 You kind of got to go one turn at a time and you like light up a little bit more.
01:31:08.480 Yeah.
01:31:09.420 Yeah, I think.
01:31:10.200 Yeah, I don't know.
01:31:10.960 I just I'm really I'm really thankful for you.
01:31:12.840 You even let me tell you what I thought, what I was like judging, and then sharing like kind of what I think now.
01:31:20.540 In 20 years, what do you hope your legacy will be?
01:31:24.320 You're going to have one.
01:31:25.400 I mean, yeah, I guess I certainly don't.
01:31:29.120 I don't think anyone sits around while they're in like the middle of the game thinking about, you know, what the review is going to be after.
01:31:35.220 At least I don't.
01:31:37.120 And.
01:31:38.740 But this is a big review you'll have.
01:31:40.760 I have never been that motivated by that. I want to, like, play the game the best I can.
01:31:48.820 I want to like, you know.
01:31:51.340 Do the best work I can, have the most fun, like have the most impact on the most interesting stuff.
01:31:56.460 But then, you know, you retire and then you die, and then like life goes on, and people, as they're supposed to, go on with life and forget about you.
01:32:03.920 And this whole thing of, like, I'm going to live on and be remembered after I die, and my legacy... like, you're dead.
01:32:11.040 You know, do you have one of those deals where you save your head with those people?
01:32:14.860 What do you mean your brain?
01:32:15.960 Sorry with the people over there.
01:32:18.760 Cryonics.
01:32:19.240 You have a cryonic deal?
01:32:20.340 No, I.
01:32:21.820 Have you been approached about it?
01:32:23.140 I have been approached about it.
01:32:24.140 There was this like Y Combinator company that I like helped out a long time ago by like giving some small deposit.
01:32:34.320 And then like I never followed up on it.
01:32:35.980 So I don't have anything in place.
01:32:37.200 OK.
01:32:38.420 But maybe.
01:32:39.120 Yeah.
01:32:39.600 Maybe just a down payment somewhere down there.
01:32:41.520 Things get weird.
01:32:42.580 We'll go knock on their door.
01:32:44.720 Yeah.
01:32:45.160 But thank you so much, man.
01:32:46.000 James Becerra says hello.
01:32:47.080 He's a friend of mine.
01:32:47.940 He's a great guy.
01:32:49.240 And we just appreciate you so much.
01:32:51.140 Sam, thanks for your time.
01:32:52.040 Thanks for doing this.
01:32:52.840 I really enjoyed it.
01:32:53.380 Thank you for your time today.
01:32:54.060 I thought it was very informative.
01:32:55.180 Now I'm just floating on the breeze and I feel I'm falling like these leaves.
01:33:01.420 I must be cornerstone.
01:33:06.520 Oh, but when I reach that ground, I'll share this peace of mind I found.
01:33:12.140 I can feel it in my bones.
01:33:16.740 But it's going to tell you.
01:33:18.560 Thank you.