This Past Weekend with Theo Von - April 28, 2025


#579 - Mark Zuckerberg


Episode Stats

Length

1 hour and 39 minutes

Words per Minute

200.1

Word Count

19,880

Sentence Count

1,626

Misogynist Sentences

31

Hate Speech Sentences

22


Summary

This week's guest is an entrepreneur, an inventor, and a philanthropist. Mark Zuckerberg is one of the richest men in the world, with a concentration of wealth and power held by only so few people. He co-founded Facebook, the company that turned into Meta, in 2004 when he was 19 years old, and we sat down with him at Meta's headquarters. I'm thankful to spend time and get to know Mark Zuckerberg.


Transcript

00:00:00.000 We hope you're enjoying your Air Canada flight.
00:00:02.300 Rocky's vacation, here we come.
00:00:05.060 Whoa, is this economy?
00:00:07.180 Free beer, wine, and snacks.
00:00:09.620 Sweet!
00:00:10.720 Fast, free Wi-Fi means I can make dinner reservations before we land.
00:00:14.760 And with live TV, I'm not missing the game.
00:00:17.800 It's kind of like, I'm already on vacation.
00:00:20.980 Nice!
00:00:22.240 On behalf of Air Canada, nice travels.
00:00:25.260 Wi-Fi available to Aeroplan members on equipped flights.
00:00:27.320 Sponsored by Bell. Conditions apply.
00:00:28.560 See AirCanada.com.
00:00:30.000 I have some tour dates to tell you about.
00:00:31.920 Miami, Florida on May 10th.
00:00:33.660 Cedar Rapids, Iowa on June 19th.
00:00:37.280 St. Paul, Minnesota on June 20th.
00:00:39.400 Fargo, North Dakota on June 21st.
00:00:42.940 Rapid City, South Dakota on June 22nd.
00:00:46.160 Winnipeg and Calgary in the Canada.
00:00:49.120 All tickets at Theovon.com slash T-O-U-R.
00:00:54.720 Please go through those links so you get accurate pricing.
00:00:57.940 And I appreciate your support for the return of the Rat Tour.
00:01:01.540 Today's guest is an entrepreneur.
00:01:03.560 He's an inventor.
00:01:04.460 He's a philanthropist.
00:01:07.200 He is one of the richest men in the world.
00:01:11.700 And a consensus of wealth and power exists in only so few people.
00:01:19.280 He co-founded Facebook in 2004 when he was 19 years old.
00:01:23.500 The company that turned into Meta, which is where we are today in their headquarters.
00:01:28.980 I'm thankful to spend time and get to know Mr. Mark Zuckerberg.
00:01:34.580 Shine on me, and I will find a song I've been singing.
00:01:43.700 I'm on this guy.
00:01:50.540 You drink coffee, man, or no?
00:01:54.480 Nah.
00:01:56.960 Really?
00:01:58.320 Yeah.
00:01:58.760 I mean, you've had it.
00:02:01.160 I have.
00:02:03.400 Sometimes on vacation, I'll drink it recreationally.
00:02:06.140 It's like every once in a while.
00:02:08.500 Just like a celebration?
00:02:11.360 Yeah, no.
00:02:12.180 Really?
00:02:13.140 Yeah, no.
00:02:14.120 I just like hate anything that messes with it.
00:02:17.380 I don't like any kind of chemicals or anything like that.
00:02:20.120 Oh, really?
00:02:20.600 So you like to keep everything, the equilibrium?
00:02:23.220 Yeah.
00:02:23.460 My sister gives me such a hard time about this.
00:02:25.080 She's like, you're just sitting there raw-dogging reality.
00:02:27.900 Wow.
00:02:29.740 It's kind of true.
00:02:30.900 Like, so you, but have you, you've had it before?
00:02:33.080 Yeah.
00:02:33.380 But you just don't like it?
00:02:34.060 Yeah, yeah, yeah.
00:02:34.260 I don't like it.
00:02:34.920 I don't like it.
00:02:35.220 So when you get up in the morning, that's not your thing?
00:02:36.800 Like, is there something you do?
00:02:37.800 No, no, no.
00:02:38.260 Yeah.
00:02:38.480 I mean, I wake up and I fight people.
00:02:40.640 Yeah.
00:02:44.280 Yeah.
00:02:44.700 No, I mean, I wake up in the morning.
00:02:46.680 And are we going, by the way?
00:02:48.360 I mean, we should, we should get this.
00:02:49.700 Yeah, we're going.
00:02:50.100 What's up?
00:02:50.540 All right.
00:02:50.740 Oh, you mean you wake up and do jujitsu, you mean?
00:02:53.340 Oh, yeah.
00:02:53.720 Yeah, so I probably wake up, like, I don't know, 7, 7:30.
00:02:57.680 Whenever, like, the kids start making noise around the house, it's like, all right, sleep
00:03:01.680 is done.
00:03:02.300 Yeah.
00:03:02.480 And then, like, it's like, I look at my phone, and I'm just like, all these things
00:03:09.760 that these people are doing, like, you did what?
00:03:11.800 Are you fucking kidding me?
00:03:12.880 It's like, I have to go fucking deal with this.
00:03:14.440 It's like, like, it's like, this partner, are you really?
00:03:17.500 God damn it.
00:03:18.380 All right, so, and then it's like, I compose myself and go fight for two hours, like, recenter
00:03:25.440 myself.
00:03:26.300 Then it's like, now I can go deal with this stuff.
00:03:28.420 But no, it's, uh.
00:03:29.680 So that helps.
00:03:30.300 It's almost like your coffee in the morning sometimes, like, rolling, like, rolling jujitsu
00:03:33.960 kind of.
00:03:34.300 Yeah, and I mean, right now I'm doing more striking.
00:03:37.260 So, I mean, that's really fun.
00:03:38.440 I just, I think it's like the greatest sport.
00:03:40.280 I mean, it's, um, it's like neurologically stimulating.
00:03:43.360 It's, uh, you know, it's good cardio, good strength.
00:03:46.760 Oh, yeah.
00:03:48.140 A little bit of a threat, right?
00:03:49.620 So it keeps you on, you know, it's not, not like just like running.
00:03:51.700 I used to like run around the neighborhood, but running is not that thrilling.
00:03:55.500 Running compared to jujitsu is, uh, running is not really neat once you
00:04:02.960 can do jujitsu.
00:04:04.680 Because I think one cool thing about jujitsu is just like, you can lose a match with somebody,
00:04:10.200 right?
00:04:10.340 You can lose like, um, like they can submit you, but you'll learn something along the
00:04:15.400 way.
00:04:15.680 Right.
00:04:16.400 And they, a lot of times the guy submitting you also wants to help you learn too.
00:04:21.020 So it's like, you can lose and win at the same time.
00:04:24.400 I think that's, what's kind of masterful about it.
00:04:26.340 Yeah, totally.
00:04:26.900 Do you, do you do it?
00:04:27.720 I don't do it as much as I would like to, you know?
00:04:29.480 Yeah.
00:04:29.840 Yeah.
00:04:30.200 Yeah.
00:04:30.600 Striking.
00:04:31.440 Yeah.
00:04:31.920 Yeah.
00:04:32.240 No, I never got into striking, but I would just kind of like, um, but I would just do
00:04:35.800 jujitsu on the mats, you know?
00:04:37.200 Yeah.
00:04:37.560 No, it's so fun.
00:04:38.400 It's just like fun to do with friends.
00:04:40.080 Oh yeah.
00:04:40.660 And some guy chokes you so hard.
00:04:42.560 And then you're just like, uh, you're like, uh, it's a good day.
00:04:47.180 It's yeah.
00:04:47.960 It's just, yeah.
00:04:48.640 It's a cup of coffee.
00:04:50.440 Yeah.
00:04:50.800 If a big fella just squeezes you, you can't handle it.
00:04:53.220 That's a cup of coffee.
00:04:54.140 Better than caffeine for me.
00:04:55.200 Yeah.
00:04:55.400 I don't know.
00:04:55.920 I, I'm, I'm just not into that stuff.
00:04:57.800 Is there a, like a, is there like, um, a vitamin or some staple that you kind of keep
00:05:03.380 in your diet?
00:05:04.020 If it's not caffeine, is there some like, I mean, I drink a very large amount of protein.
00:05:07.260 I do creatine.
00:05:08.340 Do you got that?
00:05:09.060 I mean, I don't know.
00:05:09.820 Yeah.
00:05:10.460 Um, I don't know.
00:05:11.920 Oh, vitamin D, like all that stuff.
00:05:13.180 Uh, yeah.
00:05:13.580 Vitamins good.
00:05:14.600 But, um, but like, yeah, it's not your thing.
00:05:20.200 No, no.
00:05:21.360 Never been my thing.
00:05:23.020 I, um, you know, we're not, we saw each other at the UFC and you and your wife were there,
00:05:29.860 right?
00:05:31.680 Uh, yeah.
00:05:32.340 Which one was that?
00:05:33.120 I'm trying to think.
00:05:33.720 It wasn't 312.
00:05:35.660 Was that, that was Alex fighting Magomed.
00:05:38.060 Oh yeah.
00:05:38.600 Yeah.
00:05:39.200 Yeah.
00:05:39.520 I was sad about that, man.
00:05:40.940 Yeah.
00:05:41.560 Yeah.
00:05:42.100 Did your wife, I could see, she could like really, it, it like, she was like head in
00:05:46.640 her hands sometimes.
00:05:47.420 Yeah.
00:05:47.660 No, I think, um, she, uh, I think she would say she enjoys it, but I think she mostly goes
00:05:53.800 to, uh, to support me.
00:05:55.700 Um, yeah.
00:05:56.500 No, she's into it.
00:05:57.240 I think, um, you know, it's tough.
00:05:58.800 I mean, we've gotten to know a bunch of the fighters and it's just like when you see someone
00:06:02.820 who, you know, and like get hit or go down, that's like, that's tough.
00:06:06.940 Right.
00:06:07.160 I mean, it's like, I I've gotten to, uh, like train with Volk a few times and, um, you know,
00:06:13.380 we were there at 298 when he fought Ilia and like, oh yeah, that was like, and that was
00:06:17.220 like really tough.
00:06:17.880 Did you get to walk out with him or no?
00:06:19.400 Yeah.
00:06:19.640 I mean, he asked me to, to, to walk out with him and I was like, all right, yeah, this
00:06:22.340 will be a cool experience.
00:06:23.280 And then I'm just like, you know, standing there while they're walking.
00:06:26.840 Did you see that whole meme where, where they're, uh, like he's like passing off all his
00:06:31.000 clothes and I'm just like sitting there like, uh, useless.
00:06:35.560 Yeah.
00:06:35.700 No, there you go.
00:06:36.220 That's, that was, uh, that was a fun, that was a fun one.
00:06:38.460 Was that scary?
00:06:39.240 Like what, cause I guess you feel like, yeah, what do I do now?
00:06:42.160 No, I mean, this was a fun moment, but, but I mean, but watching him, um, get hit by Ilia
00:06:48.200 so close, I mean, it was like right cage shot.
00:06:50.100 It was like three feet from us.
00:06:51.240 And I mean, he's like a big, he's like a tough guy.
00:06:53.360 Right.
00:06:53.660 And, uh, so that, that was, that was tough.
00:06:57.340 Um, but so I think after that Priscilla was like, oh man, I don't know if I can go watch
00:07:03.200 Volk in person again, but, but we, I mean, you know, I mean, we just have friends over.
00:07:07.400 It was fun watching him fight Diego.
00:07:08.640 It was good.
00:07:09.120 Yeah.
00:07:09.320 It was a great fight.
00:07:09.900 Great.
00:07:10.520 Um, it was awesome.
00:07:11.340 Tough, tough moments in, in round two and four, but he's his heart, man.
00:07:16.560 Like, I mean, the way he described it after of like, did you see the, the shot in round
00:07:20.020 four where Diego grazed his eyelid with his glove?
00:07:24.380 Yeah.
00:07:24.720 And he lost his vision and, and he like, and you could see it was kind of like, like
00:07:28.720 just the head movement and everything trying to stay in a good place.
00:07:31.960 It's like watching an animal try to survive on one of those animal planet shows or something
00:07:35.400 when you're cheering for the, you know, you're like, oh, it's not going to go well, but then
00:07:38.540 he does good.
00:07:39.300 Yeah.
00:07:39.700 No.
00:07:40.220 Um, he, he's an amazing guy.
00:07:42.740 Um, he's, he's, he's really cool.
00:07:44.240 Very talented.
00:07:44.940 A lot of heart.
00:07:45.640 Oh yeah.
00:07:46.060 And you get to know some of the fighters.
00:07:47.520 I think that's one thing that, um, is great about social media these days that you get to
00:07:51.580 know the fighters a little bit more.
00:07:52.700 Some of the regular life, like you can get involved in who they are.
00:07:55.760 Yeah.
00:07:55.960 It gives you so much more of a person to cheer for.
00:07:58.720 Um, is it tough to take your wife since you're a, um, and I'm just going to, just going to
00:08:04.260 go for it.
00:08:05.620 Since you're a, I'm going to say the word really fast, just so it's like, uh, so we don't
00:08:09.200 dwell on it.
00:08:09.680 But since you're a billionaire, is it tough to take your wife on a date?
00:08:14.000 Like at that, you know, like, do you have to live up to a standard or what's like a debt,
00:08:17.740 like a nice date night?
00:08:18.740 What is that like?
00:08:20.440 Um, I don't know.
00:08:22.420 No, I think she's pretty chill.
00:08:24.280 Um, she's a chillionaire, huh?
00:08:28.220 Sorry.
00:08:28.880 There you go.
00:08:29.720 My niece will like it.
00:08:30.540 Yeah.
00:08:30.740 There you go.
00:08:31.160 Um, no, I mean, I think the main thing for me is like, like life is busy.
00:08:37.840 There's like a million things that I could be doing at any given point in time.
00:08:40.900 I just think it's important to like take time, you know, each week, like, you know, Wednesday
00:08:46.040 night, we really try to have a date night.
00:08:48.380 Um, you know, try to hang out with the kids and put them to bed every night.
00:08:51.480 It's like, that's just like an important part of my routine.
00:08:53.600 I think that's important stuff to do, but, but no, I mean, uh, we, we try to like go out
00:08:58.500 somewhere, but every once in a while we'll just cook or eat at home and that's all good.
00:09:02.760 The UFC stuff, I think is probably more for me than for her, but she's, she's, she's a
00:09:06.640 good sport about it.
00:09:07.340 Is there something you'll do for her?
00:09:08.800 Like, or is there like a fancy date?
00:09:10.380 Cause you can, you can afford to take your wife on like a date of like that, that a lot
00:09:14.440 of us could only dream of, right?
00:09:15.540 The most, that most people could only dream of.
00:09:17.720 Is there something that you like, uh, is there some magical date that you took her on one
00:09:22.340 time?
00:09:23.420 Um, I don't know.
00:09:25.560 That is a good question.
00:09:27.120 Let's see.
00:09:28.500 Um, she's, she's pretty simple on this stuff.
00:09:31.440 I think most of the, you know, I, I like doing, I like making things right.
00:09:35.860 So I don't know if you saw this thing.
00:09:37.000 I like, I, I like working with like great artists and stuff.
00:09:40.980 So I did this project where I've always admired Daniel Arsham and, and, um, he's, he's this
00:09:46.720 like great sculptor and we, and I worked with him to make this sculpture of Priscilla because
00:09:52.140 I thought it was cool.
00:09:53.360 And, um, yeah, you're pulling this up and the.
00:09:58.500 And like, oh, first of all, I think it's like, you know, make a sculpture of Priscilla partially
00:10:05.580 because that's cool.
00:10:06.440 Partially.
00:10:06.780 Cause I'm like, I'm not going to make a sculpture of myself.
00:10:08.520 It's like that.
00:10:09.100 That's crazy.
00:10:09.660 Right.
00:10:09.960 It's like, well, we're like, who the fuck does that?
00:10:11.640 But like, um, so, so it's like, she's kind of the target of my creative energy in a lot
00:10:17.480 of places in a way.
00:10:18.420 So there's all these memes online after.
00:10:20.980 Yeah.
00:10:21.080 There you go.
00:10:21.740 Where people like, wow, I wonder what Zuckerberg did wrong that he had to make a sculpture of
00:10:27.240 his wife.
00:10:27.900 And I'm like, no, you guys are totally missing this.
00:10:30.540 This is the thing I did wrong.
00:10:32.220 It's like, you can't like, you're gonna have to wait to see what I have to do to make up
00:10:35.180 for having made a sculpture and putting it on our front lawn.
00:10:38.120 It's like, she didn't want a sculpture of her in the front lawn.
00:10:40.440 It's like, that's, that's weird.
00:10:42.100 Right.
00:10:42.340 It's like, you know, it's, but, um, no, but she, she's a good sport about it.
00:10:46.080 And, um, that, and at least you can tell the door dash guy, like just set it by the
00:10:49.340 sculpture by the sculpture.
00:10:50.740 Yeah.
00:10:51.360 Yeah.
00:10:51.660 That's kind of nice.
00:10:52.400 It becomes like, um, yeah.
00:10:53.820 And I think having like a woman as you look out in your yard is kind of nice, you know?
00:10:57.660 Yeah.
00:10:57.880 No, it's got like a good angelic form factor thing.
00:11:00.320 It's a, it's a good vibe.
00:11:01.200 Did she make, you make any adjustments to it or was she like, okay, I'll accept it as is?
00:11:05.240 Um, no.
00:11:06.840 I mean, I don't think she was that happy with it.
00:11:10.180 Um, cause again, I, I mean like who wants a sculpture of themselves and in the, in the,
00:11:14.540 in the front yard, but, um, but you know, I think she thinks it's sweet and I think she
00:11:20.840 appreciates that.
00:11:22.800 Um, I mean, there's a lot of like more destructive things I could be doing with my creative energy.
00:11:26.540 Yeah.
00:11:26.680 You made an effort too.
00:11:27.720 Yeah.
00:11:27.920 Yeah.
00:11:28.080 Yeah.
00:11:28.240 So, um, and so it's like, I, I designed her this, um, this, uh, Porsche minivan thing.
00:11:37.320 It's like, we took like a Cayenne and extended it to be bigger.
00:11:40.080 And it's like, I do that.
00:11:40.660 Cause it's like, all right, I like cars, but I'm not gonna design a super car for myself.
00:11:43.680 Let's like just design a sweet minivan for my wife.
00:11:46.680 Right.
00:11:46.940 It's like, yeah, it was just fun.
00:11:48.180 Yeah.
00:11:48.740 Yeah.
00:11:49.080 Yeah.
00:11:49.400 So I guess creative stuff like that, trying to be creative and show her some creativity.
00:11:54.140 Oh, that's nice.
00:11:54.820 There you go.
00:11:55.340 I mean, look at that.
00:11:56.140 And where did you meet your wife at?
00:11:58.340 Um, I met her in school.
00:12:00.840 So we were in college and I had just done this prank.
00:12:06.660 Um, and all my friends and everyone was convinced, we were all convinced I was going to get thrown
00:12:10.920 out of school.
00:12:11.500 Okay.
00:12:11.760 And where were you in school at, Mark?
00:12:13.440 Harvard.
00:12:13.980 Okay.
00:12:14.220 And so, so I was a sophomore.
00:12:16.260 She was a freshman.
00:12:17.920 Um, my friends were convinced I was about to get kicked out of school.
00:12:21.160 I was like going in front of this like trial for the kind of like school discipline committee.
00:12:26.160 Um, my friends threw a going away party for me and it was in the bathroom line at the
00:12:32.740 end of the party.
00:12:34.500 Um, where I was just like next to Priscilla and we were talking and nothing to lose.
00:12:40.220 It was your last meal.
00:12:41.440 I was just like, look, man, like it's, you know, have pregnancy, man, but it's, um, uh,
00:12:47.780 it was not, not very romantic situation.
00:12:50.120 It's like, we're waiting in line for the bathroom.
00:12:51.620 She's just funny.
00:12:52.300 She, she, uh, she's cute.
00:12:54.200 And it's like, I was like, all right, Hey, you know, if we're going to go out, uh, we
00:12:58.540 better go do this quickly.
00:12:59.880 Cause I'm probably going to get kicked out of school in like two or three days.
00:13:02.480 It's like, that's a good pickup line.
00:13:03.840 Right.
00:13:04.040 It's like, yeah, that's, uh, that's how you really show that you have potential in the
00:13:07.560 world.
00:13:07.860 Right.
00:13:08.180 It's, you know, it's, well, it's a limited time offer.
00:13:11.100 Yeah.
00:13:11.420 But it's not like very aspirational.
00:13:13.120 It's like, Hey, that's true.
00:13:14.160 It's like, I'm, um, going nowhere in life and, um, I'm, I'm about to get kicked out of
00:13:18.960 school, so you're going to want to, you know, you're, you're going to want to go out with
00:13:22.540 me really quickly.
00:13:23.560 Um, right.
00:13:24.100 It's like, yeah, yeah.
00:13:24.920 I guess another way to look at it is very sale rack in a way, you know, very sale rack.
00:13:28.640 Yeah.
00:13:28.800 So, but, but I mean, but everyone I knew thought that this whole thing was over.
00:13:31.880 I mean, my parents drove up, we lived in New York.
00:13:34.720 They drove up to help me pack up my dorm room cause they're like, it's over.
00:13:38.800 You're coming home.
00:13:40.080 Yeah.
00:13:40.300 Um, but, um, it didn't. Then of course I made Facebook and a few months later I dropped out
00:13:47.040 anyway.
00:13:47.260 So jokes on them, but, um, but it's, uh, but that's, that's how, uh, yeah.
00:13:53.020 No, did you have to go to the trial at school, or you didn't?
00:13:55.100 Oh no, I did.
00:13:55.880 And it was like the questions that they asked were even worse than the questions that I get
00:14:02.920 at like the congressional hearings.
00:14:04.280 It was like, don't you know that once you put this terrible prank website online, it
00:14:10.260 is there forever.
00:14:11.360 And it's like, no, actually that's not how it works.
00:14:13.680 The site's already down.
00:14:14.600 Um, they're like, you're a smart ass.
00:14:16.980 And I'm like, yeah, that's probably true.
00:14:20.360 But, but, but half of a smart ass also is smart.
00:14:25.120 Yeah.
00:14:25.560 So that's good.
00:14:26.120 I'll take that.
00:14:26.520 Yeah.
00:14:26.720 It's better than being a dumb ass.
00:14:27.820 Yeah.
00:14:28.040 Well, you know, it's like maybe smart and a little bit of an ass and it works.
00:14:32.320 Yeah.
00:14:32.500 I'd rather be a smart ass.
00:14:33.500 I think.
00:14:34.140 Um, do you know about, um, what was a prank site?
00:14:36.800 What was it?
00:14:37.200 That wasn't Facebook though.
00:14:38.280 No, no.
00:14:38.960 It was this thing called FaceMash, um, which, you know, in the whole lore of the thing, there's
00:14:43.800 this whole movie that got made about all this stuff.
00:14:45.400 They made it seem like FaceMash was a predecessor to Facebook.
00:14:47.760 It wasn't.
00:14:48.160 I, when I was in college, I, I just like making things.
00:14:50.600 So whether it's statues or minivans or internets or glasses, like whatever, you know, it's
00:14:56.180 just like making stuff.
00:14:57.120 So you like being creative.
00:14:57.940 Yeah.
00:14:58.160 So I, so like I, um, uh, so it was, it was, it was, it was very mean spirited.
00:15:06.880 It was very mean spirited.
00:15:07.940 I basically, I downloaded everyone's ID photos, um, from their ID cards and I made the site
00:15:14.020 where it showed two photos and you clicked on the person you thought was more attractive.
00:15:20.080 And then it used, and then it basically looked, took all the matchups and ranked everyone
00:15:24.340 in the school based on, um, who everyone thought was the most attractive, very mean spirited
00:15:30.200 in retrospect, um, not connected to Facebook in any way, but just like, uh, just, just like
00:15:35.540 just a college kid kind of being a jerk.
00:15:38.180 Um, but you know, it's like, okay, so that was not, not my, not my best move.
00:15:43.440 Um, but you got to take, you know, it's baby steps.
00:15:47.200 I mean, yeah.
00:15:48.420 Also pretty cool to make something.
00:15:49.920 Some of the things I did were, were, were useful and fun.
00:15:52.640 That one, I put it together in a weekend, not my best work as, um, yeah, no, not my
00:15:57.780 best work, but that's fair.
00:15:59.200 Not your best work, you know?
00:16:00.720 Not my best work.
00:16:01.040 Yeah.
00:16:01.160 We've all had things that we made that weren't, you know, that weren't the best probably.
00:16:04.360 Did you, um, let me think what I'm going to ask you.
00:16:07.200 Oh, did you, um, what kind of car do you drive?
00:16:12.160 What kind of car do you drive?
00:16:13.060 Uh, right now, Blackwing. CT5, CT5-V Blackwing.
00:16:17.780 Ooh, what is it?
00:16:18.540 Um, I mean, it's a Cadillac.
00:16:19.700 It's nice.
00:16:20.200 Oh, it is?
00:16:20.760 It's, um, it's, uh, here, pull it up.
00:16:23.500 It's, uh, I really like driving manual transmission cars.
00:16:27.320 Oh, yeah.
00:16:28.000 And, um, I like that type of thing, baby.
00:16:31.800 It's, uh, it's a good one.
00:16:34.540 Ooh, God, that's nice.
00:16:36.120 Yeah.
00:16:37.280 That thing will just shift.
00:16:38.740 That thing will just get shifty out there.
00:16:40.500 No, it's good.
00:16:41.280 It's good.
00:16:42.020 It's got a nice little ass on it.
00:16:43.640 I feel like you want your car to have almost as much horsepower as your helicopter.
00:16:48.580 I feel like it's a rule of thumb.
00:16:51.100 It's probably a rule of a very rich thumb.
00:16:53.600 Yeah.
00:16:53.760 Yeah.
00:16:53.860 So for a while, okay, so my security team kind of convinced me for like 10 years that
00:17:02.700 I should just let them drive me places, which I mean, realistically, I probably should.
00:17:08.100 But, um, but then eventually I was just like, I can't, I can't, I can't do this.
00:17:13.380 It's like, I, like, I need like the freedom.
00:17:15.360 I need to be able to drive myself.
00:17:16.380 So I started learning how to fly helicopters and then I was like, all right, well, this is
00:17:22.280 just ridiculous.
00:17:22.920 It's like, we have the security team driving me to my helicopter that I then go fly away.
00:17:27.480 It's like, that makes no sense.
00:17:30.100 It's like, I'll just, let's just go, let's get a car.
00:17:32.400 It's kind of Batman.
00:17:33.560 It's almost as a Batman, but I guess it is true.
00:17:35.700 Cause then you're just by yourself out there.
00:17:37.780 What do you mean?
00:17:38.440 Or will you take somebody with you in the car?
00:17:39.780 Oh no, no, no.
00:17:40.480 In the car, the helicopter.
00:17:41.740 In the helicopter.
00:17:42.080 Helicopter.
00:17:42.620 Yeah.
00:17:42.880 I definitely have like a real professional pilot fly with me.
00:17:46.240 Yeah.
00:17:46.500 Yeah.
00:17:46.760 Yeah.
00:17:47.220 Oh, that's wild though.
00:17:48.440 Yeah.
00:17:48.660 But I guess a chopper would get you somewhere pretty quick, right?
00:17:51.340 Uh, yeah.
00:17:52.760 I mean, it's a, it's a good, it's a good tool to have in the arsenal.
00:17:55.540 Unreal.
00:17:56.000 No, it's, um, California.
00:17:57.080 You're like a damn emperor, dude.
00:17:58.620 Does that feel like that ever?
00:17:59.680 No, no, no, no.
00:18:00.100 Um, man, it's like, I feel like I wish that people did what I wanted them to do.
00:18:05.340 It's like, that would be, that would be fun.
00:18:06.720 But it, but in the meantime, you can have some fun.
00:18:08.440 Yeah.
00:18:09.020 Oh yeah.
00:18:09.580 I would have a chopper.
00:18:10.480 I'd have a, I don't even know what I would have.
00:18:12.400 I would have an underground tunnel, even though it came right back up next to where it started.
00:18:16.560 I do have an underground tunnel.
00:18:17.940 Do you really?
00:18:18.440 Yeah.
00:18:19.880 Yeah.
00:18:21.160 In USA?
00:18:22.320 Well, I have this ranch in Kauai.
00:18:23.680 Yeah.
00:18:24.200 Where there's this whole thing where people are like, there's this whole meme about how
00:18:27.640 people are saying I built this like bunker underground.
00:18:30.560 It's, it's like more of underground storage type of situation.
00:18:33.920 But, um, but yeah, no, it's, uh, it's, uh.
00:18:36.880 Oh, wow.
00:18:38.540 Zucky got that bunky.
00:18:40.400 What's under the ground is more water, right?
00:18:43.040 It's basically what you just said.
00:18:44.360 It's sort of a tunnel that just goes to another building.
00:18:48.680 Yeah.
00:18:49.920 But it's.
00:18:51.080 It's a good place to hide a little bit of dope though.
00:18:53.160 Well, that's what I would say, dude.
00:18:55.100 Those are the good.
00:18:55.960 That's what I would do anyway.
00:18:57.480 But, um.
00:18:59.460 You and I defer there.
00:19:00.640 Yeah.
00:19:01.900 So you don't use, you've never used drugs or.
00:19:04.100 I mean, I don't know.
00:19:05.460 DARE really worked on me.
00:19:06.700 When I was in third grade, it's like, I don't know.
00:19:08.660 There's all this stuff about how it like, how it backfired and like, it just kind of
00:19:12.440 taught people how to use drugs.
00:19:13.560 For me, it like really scared me.
00:19:16.120 It was like, I don't know.
00:19:16.980 I don't like any of that stuff.
00:19:18.040 It's like a friend will show up and be like, oh, I was like getting, um, an IV to feel better.
00:19:22.380 It's like, I don't even want people to like extend their arm and show me their vein.
00:19:25.580 Like, it's like that shit.
00:19:26.720 Like, no, not me.
00:19:28.020 Yeah.
00:19:28.220 That's true, dude.
00:19:29.200 No, I'm, I'm raw dogging reality as my sister says.
00:19:31.840 Yeah.
00:19:32.100 No, that's.
00:19:32.900 There's kind of a, that's kind of insane really these days.
00:19:36.100 It used to be kind of that somebody who was like really straight edge and sober that they
00:19:40.620 was, that that was a nerdy thing, I think.
00:19:43.000 But now I never even thought about it before.
00:19:46.020 Now that's almost the most insane thing you can do.
00:19:49.120 It's like, wait, you're under the influence of nothing, nothing, nothing from sunup to
00:19:55.000 sundown.
00:19:56.180 You're a fucking animal.
00:19:57.720 You know, it's kind of crazy that it's that, that things have gone.
00:20:01.400 That's that, uh, that, that the perspective of that has kind of changed.
00:20:06.120 How big is that tunnel?
00:20:07.340 Is your tunnel pretty big or what's it like?
00:20:09.020 It's not that big of a tunnel.
00:20:10.180 It's, uh, I made a, I put a reel on Instagram one day of, um, of, of Priscilla kind of making
00:20:16.060 fun of me playing video games with some friends down there, but it's, uh, uh, yeah, no, it's,
00:20:20.840 it's, it's, it's all good.
00:20:23.020 Um, yeah, it's definitely, but this is a crazy area.
00:20:25.940 I haven't spent a lot of time.
00:20:26.960 You know how the internet is.
00:20:27.680 The internet will, will, will always make things seem like they're crazier than they are, but
00:20:31.800 you know, what's an under, I mean, in Hawaii, it's like having a, having a little storm
00:20:36.540 shelter underground tunnels.
00:20:37.600 It's pretty sweet.
00:20:38.140 That's the thing right there.
00:20:39.620 No, I don't think so.
00:20:41.200 That's insane.
00:20:42.500 Is that Roblox?
00:20:44.760 Who built it for you?
00:20:45.940 It looks closer to Roblox than, uh, than, than what, than what the thing is.
00:20:49.300 Dude, the future's death.
00:20:50.320 Things are definitely getting weird with cars is, dude, I saw four Waymos meeting up behind
00:20:55.620 the Ikea over here.
00:20:56.720 Having a meeting?
00:20:57.320 Huh?
00:20:57.600 Yeah.
00:20:57.860 Deciding what they're going to do next.
00:20:59.080 It's just like, what's your next move?
00:21:00.580 I was like, where are you going?
00:21:02.060 Who are you?
00:21:02.560 Yeah.
00:21:02.840 I think it was.
00:21:03.260 Have you seen Hot Tub Time Machine?
00:21:04.520 Um, I haven't seen it.
00:21:05.460 You would like that.
00:21:06.800 Really?
00:21:07.060 That's good.
00:21:07.400 Yeah.
00:21:07.680 No, I think that there's a, there's the self-driving car that's like hunting down the guy who was
00:21:11.460 mean to it and it's, it's, uh, it's good.
00:21:13.800 Yeah.
00:21:13.940 That's what it's.
00:21:14.520 Yeah.
00:21:14.760 That's what it seemed like.
00:21:15.600 Dude, there was four, I don't know if they were smoking a blunt or whatever, but there
00:21:18.080 was four Waymos all meeting.
00:21:20.300 Yeah.
00:21:20.620 I don't know what they were meeting up.
00:21:21.980 I'm like, what, who are they, what are they doing?
00:21:23.960 You know?
00:21:24.760 And then my buddy said he was in one and it was, um, crying to him because it like.
00:21:30.520 The Waymo was crying?
00:21:31.360 Yeah.
00:21:31.660 The Waymo was like complaining about his spouse or whatever.
00:21:34.240 And it was like.
00:21:35.120 What form does that take?
00:21:36.200 I don't know.
00:21:37.340 Is it like speaking English or is it like R2-D2?
00:21:39.320 No, I think he had said it on a British setting.
00:21:41.200 So it was like, oh, me missus is really getting up, me know?
00:21:43.920 So I think cause you can change the voice to like Indian guy or British guy or whatever
00:21:48.320 or like, um, or, uh, female or semi-female or whatever.
00:21:52.500 Semi-female.
00:21:53.440 But like, yeah, I just think it was like, yeah, just some of that Waymo stuff's just getting
00:21:58.320 out of line.
00:21:58.760 I never thought like, oh, what are these cars doing until I saw four of them meeting up?
00:22:04.740 And I was like, this seems like a lot to me, you know?
00:22:08.280 That's where the future starts to get a little bit scary is moments like that where you're
00:22:11.100 like, well, what are they doing?
00:22:12.480 And what were they doing?
00:22:14.840 Probably just waiting to pick people up.
00:22:17.040 But they'd have to do it in a group behind an Ikea?
00:22:20.460 Yeah, I don't know what that's, that does seem like an unlikely place to pick people
00:22:24.040 up.
00:22:24.400 That's true.
00:22:25.140 And will it just seem-
00:22:25.940 I'm not going to defend them.
00:22:26.980 Okay.
00:22:27.300 Okay.
00:22:27.700 Okay.
00:22:27.980 It just seemed like an interesting place to meet up, you know?
00:22:33.100 Yeah.
00:22:33.460 And then I was in, I was in one Waymo and it was like, do you want to gamble?
00:22:38.020 And I think it was like sponsored by DraftKings or something.
00:22:40.000 And it was like, you want to gamble with me?
00:22:42.080 In the Waymo?
00:22:43.080 Yeah.
00:22:43.380 I was like, it's like, I bet you $40 you'll never get where you're going.
00:22:47.120 And it was like, this is, this seems-
00:22:49.100 As the doors lock.
00:22:50.000 Yeah, yeah.
00:22:50.580 It's like, picks up speed.
00:22:52.440 Yeah, yeah.
00:22:53.600 And why are all the Waymo's white too?
00:22:55.640 I'm like, this is, I don't know.
00:22:57.940 I just think we got to start to diversify the portfolio, I feel like, a little, you know?
00:23:01.600 Yeah.
00:23:02.280 But what do I know?
00:23:04.980 You, so, let me switch topics.
00:23:08.500 Sorry, I sometimes get a little bit nervous.
00:23:09.720 Is that all right?
00:23:10.340 Yeah.
00:23:10.980 Okay.
00:23:11.300 I mean, there's no way that you're less nervous than me.
00:23:15.920 Let me think about how you said that.
00:23:18.980 Yeah.
00:23:19.280 Let me think about that too.
00:23:20.320 Did I, more nervous, more nervous.
00:23:22.140 Yeah.
00:23:22.340 See, I didn't even say it correctly.
00:23:23.880 Really?
00:23:24.340 Oh, but I think that's a trap.
00:23:25.620 I think that I am, I think I'm more nervous than you.
00:23:28.620 You think?
00:23:29.120 Yeah, man.
00:23:29.880 I wake up, I am an alarm clock.
00:23:33.100 That's how I feel all the time.
00:23:34.380 I feel like-
00:23:35.260 What does that mean?
00:23:36.060 I just feel like I'm always, you know,
00:23:39.560 Why?
00:23:41.860 You know?
00:23:42.300 I don't know.
00:23:43.100 I just think I've always felt like that.
00:23:44.260 I've always felt like kind of frenetic, you know?
00:23:46.600 It's like, how do I get things to calm down
00:23:49.100 more than how do I get things to amp up?
00:23:51.120 You know?
00:23:52.800 Like, I'll even wear earplugs a lot of the day now,
00:23:55.200 and it makes things a lot easier for me to kind of navigate.
00:23:58.080 Interesting.
00:23:58.680 Yeah.
00:23:58.900 It just makes everything easier to focus on.
00:24:00.940 It makes it easier if I'm doing sauna, steam bath,
00:24:03.980 ice bath, working out, any of those things.
00:24:05.880 It makes it everything-
00:24:06.800 You wear earbuds in the-
00:24:08.260 I'll wear earplugs.
00:24:09.600 Oh, I see.
00:24:10.280 Yeah.
00:24:10.580 And I'll wear them all night while I sleep.
00:24:11.640 It just makes everything a lot easier.
00:24:13.160 Yeah.
00:24:14.260 Ready to win some real cash during the basketball playoffs?
00:24:19.200 Check out Pick 6 from DraftKings.
00:24:23.040 When it comes to basketball payouts, DraftKings,
00:24:26.700 Pick 6 posterizes the competition.
00:24:30.300 Hit all your picks and score higher minimum payouts on Pick 6,
00:24:34.780 plus even more cash if you outscore the competition.
00:24:38.620 Pick 6 is available in most states, including Missouri, California, Texas, Georgia, and more.
00:24:45.760 I like to pick Kevin Durant, the Durantula, and more than 22 points.
00:24:53.460 Or sometimes I'll pick a Zerbzlirbiz Slingbonis Brageek and more than 11 rebounds.
00:25:03.560 It's up to you.
00:25:05.240 New players get $50 in Pick 6 credits instantly on just a $5 entry.
00:25:11.240 Download the DraftKings Pick Six app now and use code Theo.
00:25:16.080 That's code T-H-E-O for new customers to play $5.
00:25:19.940 Get $50 in Pick 6 credits.
00:25:23.360 Better payouts, bigger wins.
00:25:25.040 Only with Pick 6 from DraftKings.
00:25:28.580 The crown is yours.
00:25:30.820 Make sure to be responsible.
00:25:32.860 Gambling problem?
00:25:33.800 Call 1-800-GAMBLER.
00:25:35.080 Help is available for problem gambling.
00:25:36.840 Call 888-789-7777 or visit ccpg.org in Connecticut.
00:25:41.960 Must be 18 and over.
00:25:43.100 Age and eligibility restrictions vary by jurisdiction.
00:25:45.720 Pick 6 not available everywhere, including New York and Ontario.
00:25:48.440 Void where prohibited.
00:25:49.520 One per new customer.
00:25:50.820 Bonus awarded as non-withdrawable Pick 6 credits that expire in 14 days.
00:25:54.840 Limited time offer.
00:25:55.860 See terms at pick6.draftkings.com slash promos.
00:25:59.360 Hi, everyone.
00:26:00.600 So I've been telling you guys about MoonPay for a while now, right?
00:26:05.580 And honestly, I'm not going to stop because it's just that good.
00:26:09.880 You've heard me talk about the MoonPay app.
00:26:12.820 And for good reason.
00:26:14.140 It's incredibly user-friendly for buying and selling crypto.
00:26:17.420 But today, I want to share more about what makes MoonPay so cool.
00:26:22.980 MoonPay powers the entire world of crypto.
00:26:26.780 Some people say they are the PayPal of crypto because just like you could use PayPal to buy
00:26:33.240 anything on the internet, now you can use MoonPay to buy anything in crypto.
00:26:38.360 And while we're on the subject of cool MoonPay partners, you've definitely got to check out Uniswap.
00:26:44.160 It's your go-to decentralized exchange where swapping tokens is as easy as pie.
00:26:49.980 And guess what?
00:26:50.960 You can use MoonPay to buy more crypto for your swaps.
00:26:53.860 Whether you're a newbie or a seasoned pro, Uniswap lets you trade directly from your wallet,
00:26:59.300 keeping things smooth and simple.
00:27:01.420 Remember, while MoonPay makes buying crypto straightforward, it's essential to do your own research
00:27:06.340 and understand the risks involved.
00:27:09.240 Crypto trading can be volatile and you could lose your investment.
00:27:13.140 MoonPay is a tool to facilitate your transactions, not a source of financial advice.
00:27:18.100 Trade responsibly.
00:27:21.660 You dropped out of college, right?
00:27:25.260 Yeah.
00:27:25.560 And Alexandr Wang, he came on one time, and we were talking about him earlier, but he had
00:27:31.520 dropped out of college also.
00:27:33.380 Do you think people need college still?
00:27:37.380 Because it's just, you know, you have these creative guys who are having success and they
00:27:41.420 didn't go through college.
00:27:43.140 Do you feel like people still need college?
00:27:45.220 What do you feel like that looks like for now in the future?
00:27:48.840 I don't know.
00:27:49.200 Well, I mean, college, there's a question of how much of it is about the learning and how
00:27:54.880 much of it is about the kind of like learning how to be a grownup before you kind of go
00:28:01.940 out into the world.
00:28:04.040 I mean, for me, it's like the classes were fine.
00:28:07.600 I mean, that was fun, sort of entertaining part of college.
00:28:10.460 But I mean, I met a lot of people who are really important in my life, right?
00:28:13.420 It's like, I mean, Priscilla, my co-founders at the company, a bunch of people who are still
00:28:18.820 friends, close friends to this day.
00:28:20.860 Um, so I think that's almost more of it than, um, than like whatever class you took.
00:28:30.960 Right.
00:28:31.480 Um, but yeah.
00:28:33.780 Yeah.
00:28:33.940 That's just, that's social is learning to be around others, learning to not be at your
00:28:37.620 parents' house.
00:28:38.760 Yeah.
00:28:39.040 And I mean, and I went to boarding school for two years.
00:28:41.440 Um, and then I went to college for two years.
00:28:44.220 So I feel like even though I dropped out of college, I kind of got a full experience in
00:28:48.400 a way, um, on that, but it was good.
00:28:51.920 I feel like you just like, you know, you need some time kind of away from home a bit before
00:28:56.220 you like fully go out.
00:28:57.900 But, um, so I don't know.
00:29:00.940 I think that'll be a thing.
00:29:02.420 Um, but I do think like a lot of people, I'm not sure that college is preparing people
00:29:07.600 for like the jobs that they need to have today.
00:29:09.500 I mean, I think that that's like, there's a big issue on that and like all the student
00:29:12.160 debt issues are like really big issues.
00:29:15.120 I mean, the fact that college is, it's just so expensive for so many people.
00:29:18.680 And then like you graduate and you're in debt, um,
00:29:21.920 Well, you're not even guaranteed a job either.
00:29:23.400 You would think at a certain rate you're paying, you'd be guaranteed some sort of beginner
00:29:28.640 employment.
00:29:29.320 Yeah.
00:29:29.440 And I think that's, that's probably the big, the biggest issue with it is it would be one
00:29:32.920 thing if it were just kind of like a, a social experience, but you started off neutral.
00:29:36.800 The fact, if it's not preparing you for the jobs that you need and you're kind of starting
00:29:41.100 off in this big hole, then I think that's, that's not good.
00:29:44.980 I mean, that, that I think there's going to have to be a reckoning with and people are
00:29:48.180 going to have to, to kind of figure out whether that makes sense.
00:29:50.440 But I don't know, people, it's sort of been this taboo thing to say of like, like maybe
00:29:56.100 not everyone needs to go to college and cause there's like a lot of jobs that don't require
00:30:00.400 that.
00:30:00.880 And, um, but I think people are probably coming around to that opinion a little more now than,
00:30:06.400 than, um, than maybe like 10 years ago.
00:30:08.460 Yeah.
00:30:08.960 Do you think, I was just talking earlier, um, with Colin, I think one of your assistants,
00:30:13.720 one of your coworkers, sorry, I didn't want to say assistants, but, um, and we're kind
00:30:18.640 of saying that, um, yeah, like what classes do you think like kids should be learning now?
00:30:24.760 Because like with AI coming along and with technology starting to like really multiply
00:30:32.660 itself pretty quickly, like our ability to advance is, is going to only grow faster.
00:30:37.420 It seems like, would you feel like that's true overall statement?
00:30:40.100 Oh yeah, totally.
00:30:40.900 I think it's accelerating.
00:30:42.240 Yeah.
00:30:42.380 Accelerating.
00:30:42.820 Yeah.
00:30:42.920 That's the word I'm looking for.
00:30:43.720 Sorry.
00:30:44.000 Yeah.
00:30:44.140 So with that happening, what are like, I feel like they should be teaching how to like kind
00:30:48.860 of use AI to children right now in elementary and middle schools.
00:30:52.860 Does that, that sounds very real to me.
00:30:54.860 Do you feel like that that's like not, not how to code or anything, but just how to do
00:30:58.840 it?
00:30:59.460 So, I mean, it's interesting because the technology changes a lot, right?
00:31:02.540 And it's obviously, it's a lot different now than it was 20 years ago when I got started
00:31:06.160 with the company or when I started coding when I was a kid.
00:31:08.920 So, it's not like the specific things I learned to code when I was 15 are the skills I'm using
00:31:16.140 today, but you, I don't know, I, I, I think that there is something about kind of understanding
00:31:24.620 the technology and understanding how to use it and getting on that train that I think
00:31:29.300 is, is valuable.
00:31:30.320 Right.
00:31:30.460 But the other thing is, I just think like having good mentors or teachers, no matter what
00:31:35.440 the actual classes, like when I was in boarding school, um, I really liked studying Latin
00:31:43.560 and Greek and that's like not useful for any practical thing, but it's, it's like, uh, but
00:31:50.360 it's, it's fun and like, it's fun, it's fun.
00:31:54.160 I think it's, there are parts of it that are fun for sure.
00:31:56.220 And the tests that, that I had to take were, you know, it's, they would, you know, you'd be
00:32:01.400 reading these kind of great works, um, and the test would be, they'd pull out any word
00:32:08.480 in like whatever the, you know, third of a book was that was kind of that, that kind
00:32:13.680 of section of the class.
00:32:15.020 And they'd, they'd like show you a sentence and then they'd say, okay, this word, like
00:32:20.240 give us the full kind of grammatical and, and like poetic significance of, um, of kind
00:32:28.900 of how this word is used by this author in this piece.
00:32:32.920 So, okay.
00:32:34.260 I'm not like, I'm not that good at language.
00:32:37.160 Um, so the way that I did that class was I basically just sat and studied word by word,
00:32:44.740 the like historical significance of, of each word over like tens of pages, um, you know,
00:32:51.700 preparing for these exams.
00:32:53.340 And okay.
00:32:53.920 I don't remember that much of that at this point, right?
00:32:57.080 I mean, there were a few quotes that I think are pretty good from some of those books,
00:32:59.520 but, um, and I put them on shirts, but, um, but yeah, no, that's a good one.
00:33:05.100 What does that one mean?
00:33:06.040 Let's fall out or whatever from, from many one.
00:33:08.920 I think it's talking about how we come together as, as a, as a people or as states into one
00:33:13.780 union.
00:33:14.220 Oh, that's nice.
00:33:15.880 Um, but no, I, um, but so you did it word by word, but it still helped you.
00:33:21.880 Um, but I, I just think like the lesson from that is, uh, you know, it kind of gave me
00:33:26.900 this confidence that it's like, okay, that was like a crazy thing to do.
00:33:30.340 All right.
00:33:30.740 To have to like go learn what every word's significance, like poetic significance and
00:33:35.520 grammatical structure and all that is.
00:33:37.780 Um, and after I took that class, I was basically like, I can work hard enough to do anything
00:33:44.380 that I want, right?
00:33:45.600 Because like, I just like fucking learned all these words that don't matter, um, in order
00:33:52.900 to like nail this thing.
00:33:55.400 And like, and I won and, and, and kind of did that and, and got that to, to kind of, and
00:33:59.980 kind of, you know, had some fun doing it.
00:34:02.660 So.
00:34:03.120 Well, it's like you, like you found a, a model kind of.
00:34:05.960 Yeah, exactly.
00:34:06.560 And it was hard, it was a, it was a tough model.
00:34:09.480 Yeah.
00:34:09.780 So, I mean, there's like, there's that, I mean, there's like the math version of that.
00:34:13.160 I mean, you're talking about Alexander Wang on here.
00:34:15.020 It's like, I mean, I did a bunch of, um, the same kind of math competitions that, that
00:34:19.040 he did.
00:34:19.480 I had this like super hardcore math instructor in high school.
00:34:24.100 His name was Zuming Feng and he, um, and he.
00:34:29.080 Chinese guy?
00:34:29.700 Oh yeah.
00:34:30.300 And, um, I think at some point he, he had worked with, um.
00:34:35.620 Zuming Feng.
00:34:36.400 Bring him up, baby.
00:34:37.080 Zuming Feng.
00:34:37.420 Now he is.
00:34:37.960 We want the Feng.
00:34:39.280 You ever hear that song that we used to play that?
00:34:40.820 Well, this guy is just like a badass.
00:34:42.560 And he, he basically was involved in training the, um, the US math Olympiad team for, for a
00:34:49.760 long period of time.
00:34:50.780 And, and he, um, he made a huge impact on, on me kind of growing up and, um, and, and he
00:34:58.640 kind of taught me a little bit about like how I approach problems.
00:35:01.520 He's like, look, you actually, um, it's like, it's like, you kind of have this, it's like,
00:35:07.900 look, you're not that good at math.
00:35:10.980 I'm like, no, but, um, I'm kidding.
00:35:14.080 Um, he, he, he kind of taught me that I, I had this sort of like intuitive ability to
00:35:20.120 have a sense of like what zone the right answer was in.
00:35:23.400 So he's like, I look at your work and you do a bunch of stuff that like doesn't really
00:35:29.520 make sense.
00:35:30.240 But then at the end you come to a conclusion and you realize that that conclusion doesn't
00:35:34.020 make sense.
00:35:34.500 And then you kind of check yourself and go back and do it.
00:35:36.820 You keep on doing it until you get the right answer is like, I don't, I don't understand
00:35:40.700 how you like have this intuitive sense for like what the shape of the right answer is,
00:35:44.820 but like, that's really good.
00:35:46.000 As long as you like couple that with working really hard, you're going to be able to kind
00:35:49.640 of succeed and get a lot, um, and get a lot done.
00:35:52.800 Um, but.
00:35:53.360 So that gives you a map of how to navigate yourself, like your intuition with your hard
00:35:56.820 work.
00:35:57.040 Yeah.
00:35:57.460 So, so I don't know.
00:35:58.460 So, okay.
00:35:58.840 Like, does the specific, like, you know, high dimensional geometry or whatever the thing is that we
00:36:06.120 were working on together?
00:36:07.000 Like, do I do any of that today?
00:36:09.100 No, I don't remember any of that stuff, but like, but I think it's like, like you have
00:36:13.140 some good teachers who teach you how to think and how to work hard and like that stays with
00:36:20.000 you forever.
00:36:20.680 And I think that that's like some of the stuff that, um, I mean, I think it builds confidence
00:36:27.260 that like teaches you how you approach problems.
00:36:29.180 Then you just get better and better at some things and you, you build confidence.
00:36:32.680 And so that's the value you're saying and seeing in college.
00:36:35.160 Like those are some of the other values you might not see, like, oh yeah, I might be
00:36:39.280 not be able to be seen on paper, but the, but the value of that.
00:36:42.620 Yeah.
00:36:42.900 That's a good point, man.
00:36:43.760 It's like, yeah, I remember like some of my favorite people still to this day in my life
00:36:47.700 have been some of my teachers that challenged me or that believed in me or that said a certain
00:36:52.500 like, man, I admire the way you do this or this, you know?
00:36:55.720 And, um, and it really encourages you and kind of plants a lot of seeds to make you
00:37:00.220 want to do those things so much more.
00:37:01.660 Yeah.
00:37:02.220 And they don't have to be teachers, although obviously, you know, that's mentors.
00:37:05.860 I think a lot of, a lot of these days, a lot of people get their mentorship probably, I
00:37:09.040 would guess from teachers, unless they play a sport, you know?
00:37:11.320 Yeah.
00:37:11.540 I mean, I think, you know, the main way that you learn those from colleagues, right?
00:37:15.240 It's like people you, you work with.
00:37:17.140 Um, you know, and I, I remember, you know, growing up, I, I was like, I was really into computers
00:37:23.120 and like, when'd you get your first computer, you think?
00:37:26.660 Um, so my family had a computer probably when I was like eight or nine.
00:37:33.280 And then I think I probably got one.
00:37:35.780 And when you being there on that thing all the time, just riding the keys.
00:37:38.360 So my, my dad's a dentist and, and like, he's like, I just got a tooth yesterday.
00:37:42.640 Yeah.
00:37:42.700 Yeah.
00:37:42.860 How's that?
00:37:43.300 How's that?
00:37:43.800 There you go.
00:37:44.500 I just got a, a half of a, uh, got to see Dr. Zuckerberg.
00:37:47.340 Really?
00:37:48.420 Yeah.
00:37:48.780 It's, um, he'll, he'll take care of you.
00:37:51.220 Um, did your parents give you love or just Sonicare?
00:37:54.860 Uh, that's just an old dentist joke.
00:37:56.760 Um, glad you got that taken care of.
00:37:58.240 Yeah.
00:37:58.460 I feel a lot better about it.
00:37:59.420 Yeah.
00:37:59.800 But, um, what were we talking about?
00:38:01.300 Oh, I was just talking about how my, my dad was the type of dentist who like was really
00:38:06.420 into technology, right?
00:38:07.580 So whenever like a new laser thing came out to like drill your teeth better, like he was
00:38:12.360 on that first.
00:38:13.220 Right.
00:38:13.580 And, and, and that was kind of cool to, to be around.
00:38:16.820 Right.
00:38:17.180 And it's, and, and kind of see, like he, he clearly, he loved technology.
00:38:21.760 Um, and I think I sort of got introduced to a bunch of stuff through that.
00:38:25.780 So like he had like in his dental office, there were like a bunch of different, um,
00:38:30.940 you know, operate, operatory rooms, I guess.
00:38:33.500 And, and they each had computers and I was like, all right, you know, like, oh yeah.
00:38:38.440 It feels like the future in there.
00:38:39.720 Yeah.
00:38:39.940 It's like, you, it's like, but it was still, you know, he'd still like go between them
00:38:44.080 to talk to the different people.
00:38:45.200 I was like, you need like a chat app.
00:38:47.660 So you can just like send messages to everyone across the, the dental office.
00:38:51.120 So I like wrote that for him and it's like, all right, that's cool.
00:38:53.700 Um, so you were codes, you were driving, you were putting coding into like dental work.
00:38:58.040 You were, you were already thinking of how to connect.
00:38:59.900 I just like making stuff.
00:39:01.420 Yeah.
00:39:01.760 So yeah, yeah, yeah.
00:39:02.900 I mean, that's sort of been the theme for me is the intersection between like computing
00:39:08.240 technology on the one hand and like people and connecting people.
00:39:13.860 Um, you know, when I was in college, I wasn't there for long, but I mean, I was technically
00:39:17.580 a psychology major.
00:39:18.880 I mean, I took a bunch of computer science and math classes too, but.
00:39:22.400 Dude, if somebody even went to Harvard for four days, dude, I would hire him to be my therapist
00:39:27.880 or whatever.
00:39:28.560 Yeah, I wouldn't really, I don't know.
00:39:32.040 I think I'd rather have someone who I definitely subscribe to the theory of like, you want
00:39:37.860 people who did well at whatever they did, not just like who have some random credential.
00:39:42.080 So, um, so yeah, I guess, but where I'm from, if you heard Harvard, it was like somebody had
00:39:46.660 gone to like Mars or something.
00:39:48.560 It was very, you know?
00:39:49.960 Yeah, no, I get it.
00:39:50.920 So, but I think so.
00:39:51.700 It kind of differs from place to place.
00:39:53.520 Whenever you drop, whenever you, did you have to tell your parents you were dropping out
00:39:56.760 of college?
00:39:57.100 Was that a crazy day?
00:39:58.400 Did you DM them or would you do it?
00:40:01.020 So, I mean, I'd already started Facebook and Harvard had this nice policy where you,
00:40:05.740 you didn't have to make a hard decision to drop out.
00:40:08.340 You could just postpone.
00:40:09.360 So I just didn't go back for the next term.
00:40:11.140 But later my mom told me that she always knew I wasn't going to finish college.
00:40:17.060 I was like, mom, what the hell does that mean?
00:40:19.520 But, um, no, but actually.
00:40:21.660 But that's got to be the worst.
00:40:22.280 So imagine this, okay, say your son or child or daughter or mixed race or mixed child or
00:40:27.720 whatever.
00:40:28.500 Imagine it's a big test at school, right?
00:40:30.560 You have to hug your kid goodbye.
00:40:32.080 You know, your kid is not going to do good on that test, right?
00:40:35.960 But you still have to stay in there in the kitchen and give them a little bit of chocolate
00:40:38.960 milk and be like, you're going to do good out there today, Benny or whatever his name
00:40:42.600 is, right?
00:40:43.200 But then the second he leaves, you're just like, oh, kid doesn't have a.
00:40:46.880 It's like, Benny's screwed.
00:40:49.240 Yeah, Benny's screwed.
00:40:50.220 Um, no, I don't know.
00:40:53.660 I don't know what it's like.
00:40:55.300 No, but my younger sister.
00:40:57.840 Is our conversation going okay?
00:40:59.080 I feel like it's cool.
00:40:59.680 You?
00:41:00.280 Yeah, no, this is, I mean, it's, it's less random than I expected it to be.
00:41:03.600 You know, it's.
00:41:04.240 It is.
00:41:05.220 I'm just kidding.
00:41:06.500 Yeah.
00:41:06.880 I think it's just hard to talk sometimes to people.
00:41:09.320 No, before I went to college, my, you know, my mom, I think told me that she thought I
00:41:15.320 was going to drop out.
00:41:16.820 My younger sister bet me that she was going to finish college before me.
00:41:20.120 Ooh, I like that attitude.
00:41:21.480 And, and I was like, no, I'm going to get a degree.
00:41:23.740 And they all came true.
00:41:24.940 I dropped out.
00:41:25.740 My younger sister finished college and then I got an honorary degree.
00:41:28.060 So, you know, bonus.
00:41:29.140 Oh, you got a, they gave you that.
00:41:30.380 I know.
00:41:30.620 Yeah.
00:41:31.320 Oh, dang.
00:41:31.880 Did you even get your GED at all either?
00:41:33.800 Uh, no, I did finish.
00:41:34.760 I did finish high school.
00:41:35.980 You did?
00:41:36.680 GED is high school, right?
00:41:37.780 Is that?
00:41:38.280 I don't know.
00:41:38.620 I think they make a GED premiere now.
00:41:40.880 I don't know.
00:41:41.580 I don't know.
00:41:41.960 Anyone?
00:41:43.000 No?
00:41:43.620 No?
00:41:44.040 GED is high school equivalent.
00:41:45.560 All right.
00:41:46.060 Thank you.
00:41:47.220 Um, I got a real, I got a, I got a high school diploma.
00:41:49.800 You got a high school diploma.
00:41:50.420 That's my, that's my education.
00:41:51.860 Dude, that's 94% of our country.
00:41:54.680 So you're in good hands, brother.
00:41:57.160 You're right there with some of the greats.
00:41:58.560 Um, so when you started, when you first started, let me think of, I want to think, oh, where
00:42:04.300 was your first date that you took, uh, Priscilla to?
00:42:06.360 Do you remember?
00:42:07.560 Yeah.
00:42:08.120 For, for Priscilla, that was, uh, we went to this place, Burdick's.
00:42:10.900 It was, um.
00:42:12.100 Burdick's?
00:42:12.700 Burdick's.
00:42:13.160 L.A. Burdick's.
00:42:13.900 This, uh, like chocolate place, hot chocolate.
00:42:16.780 Um, that was, that was pretty good.
00:42:19.460 Um.
00:42:19.980 And was it just like y'all went and sat down or you, is it like a restaurant or just a place
00:42:24.080 to get like that?
00:42:24.380 Yeah, it's like a little coffee place, little like hot chocolate type coffee place.
00:42:27.800 It's, um, yeah, no, it's nice.
00:42:30.380 They, they, they make good chocolates.
00:42:31.520 They make these little mice things.
00:42:32.880 Ooh, yeah.
00:42:33.500 Yeah, chocolate mice.
00:42:34.700 So now every year on our anniversary, we get some mice.
00:42:39.420 Oh, yeah.
00:42:40.620 Good deal.
00:42:41.640 Hell yeah.
00:42:42.340 Hold on.
00:42:42.920 Nothing like a couple of tech emperors enjoying a couple of mice.
00:42:45.840 You feel me?
00:42:47.140 I think that's very normal.
00:42:51.080 That's very normal to me, dude.
00:42:53.760 Um.
00:42:55.220 Sorry, that just made me laugh.
00:42:56.780 I'm sorry, Mark.
00:42:58.700 That just made me freaking laugh.
00:43:00.520 It's just crazy to think.
00:43:02.840 Well, chocolate mice.
00:43:04.900 Yes.
00:43:05.440 Chocolate mice.
00:43:06.000 Yeah.
00:43:06.180 Yeah.
00:43:06.440 And that, yeah.
00:43:06.920 And we'll put that in the notes, in the notes.
00:43:08.680 Um.
00:43:09.020 Yeah, no, it's not.
00:43:09.820 I feel like when you go like full Roman Empire, it's like, give me, give me the live mice.
00:43:18.200 Give me the live mice.
00:43:20.720 And then one mouse shows up and he has like a bad leg.
00:43:23.500 You're like, all of them fully live.
00:43:27.600 Dude, if somebody sees a mouse with a bad leg, nobody even cares.
00:43:30.560 That's the saddest thing about some mice.
00:43:31.880 But actually, I used to sell hamsters.
00:43:33.120 My first job was selling hamsters growing up.
00:43:35.280 And a lot of that market, they brought in these Russian hamsters, right?
00:43:39.200 Uh-huh.
00:43:39.580 And it took away a lot of the American market.
00:43:41.900 And the Russian ones, they're called like the Roborovskys.
00:43:45.400 Bring up a couple of Roborovskys, man.
00:43:47.420 And they were, uh, they, they, they took away like the fluffy kind of American hamster.
00:43:55.240 And they.
00:43:56.820 I mean, they're pretty cute.
00:43:58.020 They are cute.
00:43:58.820 But the ones that we were getting, a lot of them were from Russia.
00:44:01.360 Pull up, from Russia.
00:44:02.600 Pull up, uh, Roborovsky hamsters from Russia.
00:44:06.040 Uh, the small white ones with kind of the red eyes.
00:44:11.040 Yeah, they, they really, I think they, um, they put some visual effects on some of their eyes.
00:44:16.500 There you go.
00:44:17.540 These were really not helping people feel good.
00:44:20.020 Yeah, no, that's like, that's like a vampire.
00:44:22.060 It was a little bit much.
00:44:23.180 It was a lot for some of the children.
00:44:24.620 You know, so one of my daughters is really into hedgehogs.
00:44:27.920 So we, we found, um, we were in like, we took them to Japan.
00:44:32.880 And there's this hedgehog cafe that you can just go in and they can just like play with the hedgehogs.
00:44:37.280 I've never seen a hedgehog.
00:44:38.240 Yeah, no, it's, it's, uh, they're, they're pretty cute.
00:44:40.980 It kind of looks like that, but less red eyes.
00:44:43.700 But, um, yeah, no, shorty bae.
00:44:47.180 Yeah.
00:44:47.940 She's out there.
00:44:48.780 Those are beautiful.
00:44:49.660 Now those, the, the idea of cafes where you can hang out with animals.
00:44:53.880 Um, Priscilla was telling me that she took the kids to this cat cafe and like, it's like my daughter was like hanging out with the cats.
00:45:02.600 And it was like, she couldn't find a cat that she liked.
00:45:05.620 And then like finally found a cat that she was into.
00:45:07.820 And then like, finally, just as she was sitting down, like hanging out with the cat, someone came in and was like, oh, this cat has been adopted.
00:45:14.220 And took it, came and took it.
00:45:16.400 Oh, just like that issue that's going on with that guy.
00:45:19.360 They deported to El Salvador right now.
00:45:21.000 It sounds like very, I don't know if it's similar to that, but it sounds like there's like a lot.
00:45:24.600 That seems completely different.
00:45:25.620 Yeah, you're right.
00:45:26.160 Absolutely a completely different thing.
00:45:27.700 You're right.
00:45:28.420 That's, um.
00:45:29.420 Yeah, you're right.
00:45:30.520 It's not the same fairy tale.
00:45:31.860 I don't know how either one of those fairy tale ends, but that's so sad.
00:45:35.180 Well, the cat at least got a home.
00:45:36.180 Yeah, the cat got a home.
00:45:37.420 And this, I think that guy will get home.
00:45:38.740 I don't know a lot about it.
00:45:39.680 It just like spawned that in me.
00:45:41.040 That could be something like that.
00:45:42.440 Um, do you, you have how many daughters?
00:45:45.240 You have a.
00:45:45.740 Three daughters.
00:45:46.260 Three daughters.
00:45:46.980 Yeah.
00:45:47.240 You had three sisters.
00:45:48.300 I did.
00:45:48.860 Yeah.
00:45:49.080 So you're.
00:45:49.700 Kind of surrounded by girls.
00:45:51.060 God.
00:45:51.540 Yeah.
00:45:52.480 And did you, what's it like, like bedtime with your daughters at night?
00:45:55.760 Like, what is that like at, at the Zuckerbergs' home?
00:45:58.320 Um, yeah, I mean, it's just, it's like, so.
00:46:00.860 So, because I'm so busy during the day, I try to make it so that, like the time that
00:46:05.020 I know I'm going to hang out with the kids at night, I try to spend like half an hour
00:46:08.020 with each of them at night.
00:46:09.140 And, and they're different ages.
00:46:10.320 So we got like a nine-year-old, a seven-year-old, a two-year-old.
00:46:13.020 Oh, wow.
00:46:14.060 Yeah.
00:46:14.400 Fun, huh?
00:46:15.300 Yeah.
00:46:15.500 So bedtime.
00:46:16.540 I don't know.
00:46:16.880 I mean, the kids are all crazy in different ways and, and we've just kind of like try
00:46:20.860 to connect with them on, on whatever they want.
00:46:23.920 Um, like, what do they like to do? The two-year-old is just like starting
00:46:29.540 to learn how to speak.
00:46:31.580 Unless she speaks like code, which would have been crazy.
00:46:34.940 No, well, well, she's like, yeah, no, the two year old has very strong opinions and I
00:46:39.780 think it's going to be very interesting.
00:46:41.220 Really?
00:46:41.580 Um, yeah.
00:46:43.560 Um, yeah, but it's like, let me, let me think about what, I mean, she's, um, I don't know.
00:46:53.240 I need to think about something funny that she's doing, but while, while, um, while I'm thinking
00:46:57.000 about that, the, uh, the seven year old is just like purely generative, constantly just
00:47:04.020 creating things.
00:47:05.540 Um, you know, she's the one who it's like, we'll do like 3d printing.
00:47:09.560 She'll create like 3d worlds in this like horizon metaverse system that we're building.
00:47:16.260 Um, she codes, she writes books.
00:47:19.420 She, um, she basically like makes music.
00:47:23.220 Um, she is, she's seven years old.
00:47:26.480 She's seven.
00:47:27.240 Yeah.
00:47:27.820 Seeing her has really like, I think helped me understand myself in a way because it's
00:47:33.660 like, it's kind of like, okay, like why do I just keep like building stuff?
00:47:37.060 Like, why do I care so much about creating stuff?
00:47:39.080 And it's like, I don't know.
00:47:39.820 I think some people just like have a thing in them where it's like, they have to create
00:47:43.300 stuff.
00:47:43.580 Like the stuff just like comes out of them.
00:47:45.100 They're constantly generating things and like, and, and kind of producing stuff.
00:47:49.940 And, and you notice that in her.
00:47:51.520 She's like that.
00:47:52.300 Yeah.
00:47:52.780 Yeah.
00:47:52.940 I think, yeah.
00:47:54.420 I mean, I think before, before kind of, she started growing up and doing that, Priscilla
00:47:58.780 was just like, why don't you just like relax?
00:48:02.340 Right.
00:48:02.600 It's like, you've done, you've built enough things, right?
00:48:04.480 It's like, your company is good.
00:48:05.600 Like you can just chill.
00:48:06.960 Then, and now, now I think after seeing, um, this one, she's just like, okay, no, I get
00:48:13.660 it.
00:48:13.900 Yeah.
00:48:14.100 You, yeah, you got, you got, you have the same thing.
00:48:15.720 You have what she has, right?
00:48:16.960 It's like, you're just constantly creating stuff.
00:48:19.240 She's like, dude, you're Thomas Edison.
00:48:21.700 There you go.
00:48:22.440 There you go.
00:48:22.800 I like that.
00:48:23.440 Yeah.
00:48:23.700 No, it's a, yeah.
00:48:24.740 Thank you, dude.
00:48:25.920 Every now and then.
00:48:26.320 And then, you know, our oldest daughter is just like, is.
00:48:29.020 And the oldest one is nine, you said?
00:48:30.180 Yeah.
00:48:30.340 She's nine.
00:48:30.740 And, and she's getting up there a little bit in kid years.
00:48:33.440 So she's getting into like the zone where she's like, she competes.
00:48:37.680 She's doing like history bee and like competing in math and, um, and really wants to like understand
00:48:44.660 the world.
00:48:45.140 So, um, so my, my activities with her are like, we'll usually like go through the news and
00:48:51.360 she'll find like one thing, we'll pick out a story and then we'll just like talk
00:48:56.480 about it and it's, it's really interesting because like, it hadn't occurred to me before
00:49:01.640 how much in order to understand like technology, you need to really understand like government
00:49:08.960 and civics and politics and law and like all of these different things.
00:49:13.780 So trying to explain to like a child who hasn't thought much about these things, um, is
00:49:23.360 both very fascinating for, for me.
00:49:25.800 Um, but it's just been like a cool bonding experience, and you, you start
00:49:30.460 off and it's like a really basic understanding, but then after you've done it for like a year
00:49:33.480 or two, it's like, okay, she has like a very good understanding of like the tech industry
00:49:38.380 and the world and all this stuff.
00:49:40.540 And it's, it's, um, it's pretty interesting to, to talk through.
00:49:43.920 Well, I think it's inspiring just to even parents to hear like, what can you talk to your
00:49:47.520 kid about?
00:49:48.120 You know what I'm saying?
00:49:48.580 Like if your kid's 13 and you're still reading Berenstain Bears or whatever, some of that's
00:49:53.760 on you.
00:49:54.200 Like you got to evolve some of the curriculum for your child, you know?
00:49:58.580 Um, because yeah, a lot of parents think you probably can't kind of build these worlds in
00:50:02.940 a child's head, but if you start with the basic blocks, then you kind of can, you know,
00:50:07.280 and help them, uh, get bigger ideas about things.
00:50:09.700 I think that's cool.
00:50:10.600 I think there's nothing more important than how a parent communicates with their child, you
00:50:14.360 know, and what they communicate to them.
00:50:15.860 Um, because so often we just expect like teachers and different curriculum out in the world or
00:50:20.500 just the world to do that for our children.
00:50:22.800 But man, I think the, the biggest faucet for a child is the, is the parent, you know?
00:50:27.780 Um, that's pretty cool.
00:50:29.740 And the little one is just a little kid.
00:50:31.980 She's two.
00:50:32.420 We got, we got a, we got a bonus.
00:50:34.980 Yeah.
00:50:35.520 So it's, um, no, she's, she's good, but I mean, obviously very, she's just obsessed with
00:50:41.140 her older sisters.
00:50:42.140 Like, so one day she just decided like, I am not a baby and we're like, oh, you're
00:50:47.900 the family baby.
00:50:48.660 She's like, I'm, I'm a big girl.
00:50:50.040 It's like, that's like a core part of her identity is she's a big girl.
00:50:53.240 It's like, email me now.
00:50:54.440 Yeah.
00:50:54.660 No, it's like, um, yeah.
00:50:57.180 Email me now.
00:50:57.960 Email me at Zucker baby at Meta.
00:51:00.400 Meanwhile, she can't even like fully pronounce it.
00:51:02.580 She goes, I'm a bit girl.
00:51:04.380 It's like, okay.
00:51:05.540 It's like, well, maybe you learn how to pronounce it first and then I'll believe you.
00:51:08.400 But, um, but, uh, no, she's like, I weigh six terabytes, you know, like calm down, you
00:51:16.640 know?
00:51:17.600 Yeah.
00:51:18.100 No, she's, um, so funny being like a, like a, and the stuff that we do, it's like, I mean,
00:51:24.420 it's when you're two, you're trying to like teach them like basic stuff.
00:51:28.300 So it's like, we have this book.
00:51:29.080 It's like, how do I feel?
00:51:30.340 Right.
00:51:30.520 And it's just like pictures of other kids and like their facial expressions.
00:51:34.880 And it's like, the only emotion that she identifies with is happy.
00:51:40.940 And so she'll be having like a terrible meltdown and like sobbing and I'm like, Oreo, how, how
00:51:46.940 you feel?
00:51:47.460 She goes, happy.
00:51:49.340 And it's like, no, you don't have to, you don't have to say happy all the time.
00:51:53.020 It's like, it's normal.
00:51:54.240 Well, like in life, it's like, we have all these different emotions.
00:51:57.780 Like you, like it's, it's important to understand when you're sad or, or like angry or frustrated
00:52:03.220 or something, but, but that's hilarious.
00:52:05.920 So we're still working on that.
00:52:07.000 We're working on, we're, we're still in the remedial, um, emotions phase.
00:52:11.880 She'll be like crying.
00:52:12.860 Like, how do you, but she knows that's the one.
00:52:14.480 She's like, yeah, no, it's like, Oreo, her name's Aurelia, but we call her Oreo, which I'm
00:52:19.860 sure we're going to regret when, um, when, uh, when she's older.
00:52:23.520 Well, it's cute.
00:52:24.680 I think Oreo is a cute girl name.
00:52:25.560 She'll probably marry a basketball player.
00:52:26.940 I'm guessing.
00:52:27.520 Who knows, Mark?
00:52:28.480 Who knows what'll happen?
00:52:29.780 It could go, you know, Oreo Zuckerberg, dude, that's when you take, um, that's when you
00:52:36.860 like, that's when you get into the food industry.
00:52:39.260 Yeah.
00:52:39.680 Oreo Zuckerberg.
00:52:41.220 I don't know.
00:52:41.760 She's, she, she, I think could be.
00:52:44.480 The sugar empress.
00:52:46.220 Well, see, you know.
00:52:46.800 Of Menlo Park.
00:52:47.780 Kids love like, you know, kids love like, they go through a phase where they love cleaning.
00:52:53.820 And they love like playing cooking, right?
00:52:55.860 It's like, oh, just give me some, like, like a pan or like, I don't know if that's.
00:52:59.340 Yeah.
00:52:59.820 Dad wants a cheeseburger.
00:53:01.160 I think a lot of kids are like that.
00:53:01.800 Yeah.
00:53:02.500 She's just like, give me a desk.
00:53:05.400 Like, I want, like, I want to work like dad does.
00:53:07.540 Like, give me a desk.
00:53:08.840 And like, like, I don't want to do this, like cooking shit.
00:53:12.480 Like, it's like.
00:53:14.260 Yeah.
00:53:14.460 We want to own a Popeye's.
00:53:15.980 And she's like, but I like that though.
00:53:19.320 She has like a play, you know.
00:53:20.980 No, she's, she's a bit girl.
00:53:22.400 Yeah.
00:53:22.800 I like that.
00:53:25.460 I love that idea that some child somewhere.
00:53:28.700 This is going to make my day so much better.
00:53:30.420 Just knowing that a child is just.
00:53:32.220 There's a child behind a desk.
00:53:33.620 Just like demanding.
00:53:34.940 It's like, like, yeah.
00:53:36.560 Yeah.
00:53:36.920 There's not enough bandwidth in this crib.
00:53:38.840 Yeah.
00:53:39.080 Not enough bandwidth to the crib.
00:53:40.620 Yeah.
00:53:40.820 To pull off what I need to pull off.
00:53:43.120 I know the kids.
00:53:43.940 You know, I don't.
00:53:45.140 Is it fun being a dad?
00:53:46.300 Do you feel like you enjoy it?
00:53:47.700 Does it feel kind of.
00:53:49.480 Is it tough to connect with your children?
00:53:51.540 Does that ever feel like a thing as a dad?
00:53:53.260 With girl daughters?
00:53:54.360 With, or with.
00:53:55.400 Yeah.
00:53:55.980 I don't know.
00:53:56.620 I think it's like, there's probably some things that they connect better with, with mother about.
00:54:02.440 But like, I feel like there's also like, I don't know, there's all these weird dynamics.
00:54:06.000 I know there's some things that they like probably just connect with me better about where it's, you know, it's, it's just a different dynamic.
00:54:10.820 With, with, um, father.
00:54:13.420 Right.
00:54:13.700 So.
00:54:14.280 And what do they call you?
00:54:15.360 Do they call you all dad, father, mother?
00:54:17.440 Dad.
00:54:17.900 Although sometimes when they're, they're being disobedient, it's Mark.
00:54:22.240 And it's like, no, dad is a, dad is a title, an honorific title that I've earned.
00:54:29.680 Mark.
00:54:30.080 And they're like, what is going on?
00:54:32.440 It's like Oreo.
00:54:33.120 That is bullshit.
00:54:33.840 Get back in your crib.
00:54:35.720 I'll see you in court.
00:54:37.180 It's, these diapers are too tight.
00:54:42.560 Tell my lawyer.
00:54:43.740 That's hilarious.
00:54:45.100 Um, that's crazy.
00:54:46.740 I don't know.
00:54:46.840 I think like, it's just good for them to, to have good role models.
00:54:51.180 Right.
00:54:51.580 You know, part of the way that I, I feel like when kids grow up, they either end up wanting
00:54:58.900 to like marry people who are the opposite of their parents.
00:55:02.760 If it was a bad experience or people who they like think of as sort of like, oh, this was
00:55:07.380 a good role model.
00:55:08.180 Right.
00:55:08.560 And so I feel like as long as they look up to us, that's like, that's kind of the, that's,
00:55:15.340 that's, you want to like set a good example.
00:55:17.060 Right.
00:55:17.340 So, um, but I don't know, I, I kind of, for me, we talked about this a little bit with,
00:55:23.320 um, with the fighting stuff early on.
00:55:27.060 But for me, it's always just been important that it's like, I don't just want to like
00:55:32.260 be a person who like sits and works all the time.
00:55:34.460 I think like, you know, it's, we're not like meant to just sit at a desk all day long and
00:55:39.240 one day Oreo will learn that.
00:55:40.760 Um, but, um, and I think like a lot of life is like, you move around, you like, you know,
00:55:46.320 it's like, we're like meant to be active and do stuff.
00:55:50.280 And I think that that's a big part of it.
00:55:51.900 And I try to like, it's important to me that, you know, the kids get that too.
00:55:55.620 And the kids are very active.
00:55:57.900 Um, do your kids have a lot of, um, like screen time?
00:56:02.700 Like how much screen time do you allow your kids?
00:56:04.960 Yeah.
00:56:05.320 Um, it's different for the different ones.
00:56:07.460 We don't just like let them do whatever, but I actually like want them to be fluent with
00:56:13.800 this stuff.
00:56:14.400 And that kind of like we talked about earlier, um, you know, I want them to learn how to
00:56:19.000 code, how to use technology.
00:56:21.360 Um, I think it's important because a lot of socialization, you know, obviously like
00:56:26.640 happens online at this point, like people need to get used to the norms and stuff around
00:56:30.640 that.
00:56:30.880 So, I mean, they're not on, they're still too young to be using like social media, but,
00:56:35.140 uh, but they have messenger kids.
00:56:36.800 You know, we, we, um, make it so that they can video chat and, and chat with their friends.
00:56:42.920 And, you know, we'll, we'll obviously monitor to make sure that they're, um, that they're just
00:56:47.640 connecting with the people who, who we think that they should, but like, I think it's actually
00:56:51.160 good.
00:56:51.540 I think people need to kind of grow up, um, "need to" is strong, but I think,
00:56:57.420 um, I think it's good if you have an engaged parent and, and, as a child,
00:57:04.900 you, you kind of grow up, um, learning how to use a bunch of this stuff.
00:57:08.800 So I think that that's, I think that's all good.
00:57:10.460 I want the kids, um, to the extent that they're interested in it to learn how to code, learn
00:57:15.720 how to create stuff, whether it's in like horizon or VR tools, or they play, you know,
00:57:20.840 Roblox and Minecraft and stuff like that.
00:57:23.060 I think that's good.
00:57:23.840 And then, you know, there's a lot of educational study type tools that, um, you know, when,
00:57:29.680 um, when our daughter's studying for her competitions or whatever, she can, she can kind of, uh, make
00:57:35.060 a lot of progress on that.
00:57:36.020 So I think that that stuff is good, but we're not just like letting them, um, just kind of
00:57:40.900 sit and watch stuff all day long.
00:57:43.760 So you think, unless you're on a plane, then you do whatever you need to do to get through
00:57:48.640 that flight.
00:57:49.240 Yeah.
00:57:50.980 Thank you, dude.
00:57:52.020 Somebody else needs to say that.
00:57:53.340 Yeah.
00:57:53.620 No, it's, um, where's the, I want to, when do we come out with that pacifier that really
00:57:59.320 shuts these kids down for a couple hours?
00:58:02.560 I'm not saying it has to be anything crazy, nothing illegal, but we need, we need a high
00:58:07.900 voltage pacifier back in society, Mark.
00:58:10.620 Oh man.
00:58:11.520 Um, give us something, brother.
00:58:12.880 Yeah.
00:58:13.640 These kids are screaming.
00:58:15.180 They, they sometimes do.
00:58:16.940 Oh, the craziest thing happened to me.
00:58:19.840 One time I'm on a plane, I'm dreaming.
00:58:22.900 Right.
00:58:23.280 And I'm like, man, I'm really having some nice dreams because I never have good sleep
00:58:27.320 or anything.
00:58:27.820 So every now and then I come to the surface of my dreams and I'm like, wow, I'm still
00:58:31.220 dreaming.
00:58:31.560 And I go back underwater.
00:58:32.520 Um, but I heard a ukulele playing, right?
00:58:36.180 Ukulele.
00:58:36.640 And at some point I wake up and there is an effing child, or child as some people call them,
00:58:43.500 playing a ukulele with his parents on this plane.
00:58:49.160 And I walked up and I put my hand on it in front of the parents.
00:58:52.540 I think I was still groggy from being asleep.
00:58:54.060 And I was like, we can't do this today.
00:58:57.200 It's like the ukuleles.
00:58:58.940 It's, uh, it's, uh, it's a lot.
00:59:01.220 It's time out for you.
00:59:02.000 Yeah, it is.
00:59:03.360 We're shutting down this little hand Hawaii you got going on right here, brother.
00:59:06.640 I mean, it probably took a lot of restraint to not just pick up the ukulele and smash
00:59:09.900 it.
00:59:10.080 Oh, it really did.
00:59:11.400 Yeah.
00:59:11.840 I mean, it's, yeah.
00:59:13.020 It really did.
00:59:13.680 So I just said, we're not doing this today in a tone that was-
00:59:17.200 That seems, seems stern for a child.
00:59:19.780 It was.
00:59:20.400 Yeah.
00:59:20.520 And Bojo's parents are right there.
00:59:21.520 Especially for a child who you, is not your child, who you don't know.
00:59:25.480 Very fair.
00:59:26.420 That's the type of thing that you could cause a scene.
00:59:28.100 Very fair.
00:59:28.820 I got lucky.
00:59:29.620 The parents were-
00:59:31.400 They agreed.
00:59:32.000 They were probably pissed about the ukulele, too.
00:59:33.960 But they were afraid to tell their kid something.
00:59:35.680 Yeah.
00:59:35.900 It's like, so you did them a favor.
00:59:37.300 Yeah.
00:59:37.540 It's like, hey, Oreo, sometimes business is closed.
00:59:40.180 Yeah.
00:59:40.420 No, it's like, yeah.
00:59:41.280 Working hours are over for today, okay?
00:59:44.900 Well, you'll be back in the office tomorrow.
00:59:46.920 Like, sometimes you just have to, you know, but sometimes-
00:59:48.760 So I agree.
00:59:49.940 Please do whatever you can to stop these kids from being on planes or screaming on planes.
00:59:57.120 Because we are looking-
00:59:58.280 That's one thing we are all looking to you for.
01:00:00.360 Well, maybe one day we'll deliver that.
01:00:03.080 Nobody needs help spending...
01:00:05.680 The Chevrolet employee pricing event is on now.
01:00:08.780 Get a big cash purchase discount of up to $11,300 on the 2025 Chevrolet Silverado LD ZR2 and Silverado HD ZR2.
01:00:19.100 With a factory-installed lift kit and Multimatic DSS-V dampers on both the Silverado LD and HD ZR2,
01:00:25.800 you'll have all the capability you need to leave the asphalt behind.
01:00:29.880 Hurry in.
01:00:30.700 Employee pricing is on for a limited time.
01:00:32.520 Visit your local Chevrolet dealer for details.
01:00:35.620 ...their money.
01:00:37.320 Sometimes it feels like the whole world is just trying to spend your money.
01:00:41.460 We're going to spend your money.
01:00:43.940 But people don't need help spending their money.
01:00:46.340 People need help growing their money.
01:00:48.980 That's why there's Acorns.
01:00:51.420 In a world that wants to spend your money,
01:00:53.620 Acorns hopes to grow your money into more of your money with simple saving and investing.
01:01:00.520 You don't need to be rich.
01:01:02.660 Acorns lets you get started with the spare money you've got right now,
01:01:05.760 even if all you've got is spare change.
01:01:08.840 And you don't need to be an expert.
01:01:10.980 Acorns recommends a diversified portfolio that can help you weather all of the markets' ups and downs.
01:01:17.220 Acorns even works for children.
01:01:19.920 I recently got my niece and nephews set up on Acorns so they can see how savings works.
01:01:25.820 Sign up now and join the over 14 million all-time customers
01:01:29.860 who have already saved and invested over $25 billion with Acorns.
01:01:35.920 Plus, Acorns will boost your new account with a $20 bonus investment.
01:01:40.280 Offer available at acorns.com slash T-H-E-O.
01:01:45.140 That's A-C-O-R-N-S dot com slash T-H-E-O
01:01:49.940 to get your $20 bonus investment today.
01:01:52.480 Paid non-client endorsement compensation provides incentive to positively promote Acorns.
01:01:55.380 Investing involves risk.
01:01:56.160 Acorns Advisors LLC, an SEC Registered Investment Advisor.
01:01:58.180 View important disclosures at acorns.com slash T-H-E-O.
01:02:01.760 Podcasting, it felt like everything, you had to do it yourself.
01:02:04.900 You had to make sure your backdrop and your curtain was set up and clean.
01:02:09.260 And you had to make sure the thing was in focus and then sit down and get it back in focus.
01:02:14.040 It was just, and you had to do the editing.
01:02:16.500 It wasn't overwhelming, but it was a lot, right?
01:02:18.940 That's all I'm saying.
01:02:19.720 I'm not complaining, but that's how it is until you get some help.
01:02:25.320 For millions of businesses, the tool that helps is Shopify.
01:02:30.000 Shopify is the commerce platform behind millions of businesses around the world
01:02:35.360 and 10% of all e-commerce in the U.S.
01:02:39.420 From household names like Mattel and Gymshark to brands just getting started.
01:02:44.620 Shopify helps you build a beautiful online store to match your brand style,
01:02:48.340 turn your big business idea into reality with Shopify on your side.
01:02:55.380 Sign up for your $1 per month trial and start selling today at Shopify.com slash T-H-E-O.
01:03:03.760 Go to Shopify.com slash Theo.
01:03:07.660 Shopify.com slash Theo.
01:03:10.140 I know it's pretty much almost May, it's before May, but I'm still recovering from the holidays.
01:03:17.840 I know it sounds crazy, but I know I'm not the only one life in general can be chaotic,
01:03:24.760 but if you're in charge of order fulfillment for an e-commerce business, you know that that's its own special kind of chaos.
01:03:33.700 But with ShipStation, you can count on your day-to-day remaining calm.
01:03:40.340 ShipStation saves hours and money every month by shipping from all your stores with one login,
01:03:46.500 automating repetitive tasks, and finding the best rates among all the global carriers.
01:03:52.380 You'll never need to upgrade.
01:03:54.740 ShipStation grows with your business no matter how big it gets.
01:03:58.220 Calm the chaos of order fulfillment with the shipping software that delivers.
01:04:03.780 Switch to ShipStation today.
01:04:06.600 Go to ShipStation.com and use code Theo to sign up for your free trial.
01:04:12.500 That's S-H-I-P-S-T-A-T-I-O-N.com and use code Theo.
01:04:18.680 That's ShipStation.com, code Theo.
01:04:22.800 I do think a future version of the glasses will get there.
01:04:26.020 I think you had a chance to play around with this a bit.
01:04:28.460 Yeah, I did.
01:04:29.080 Are those the ones you have on right now or no?
01:04:30.780 No, this is sort of, I mean, these are the ones that are available today.
01:04:33.700 These are like Ray-Ban AI glasses.
01:04:35.460 So, I mean, it's, you know, they're glasses.
01:04:37.920 They can, you know, the main thing is they can, you can take photos or videos with them.
01:04:43.700 I love just using them for listening to music and taking phone calls on them because your ears are open, right?
01:04:49.960 So it doesn't like, it doesn't obscure your ability to hear anything else.
01:04:52.720 The audio quality is really good, right?
01:04:55.580 Because it has a microphone, like it's a contact mic that's like basically in the nose pad.
01:05:01.120 Audio quality is great.
01:05:01.720 Yeah.
01:05:01.900 So it's like, you can be on a plane and take a phone call and like the person on the other side can't even hear that you're on a plane.
01:05:09.380 Like it's just like, you can be in a wind tunnel, just like whatever.
01:05:12.200 The sound quality is amazing.
01:05:13.980 But then the main thing is they're AI glasses.
01:05:19.760 So you can, glasses I think are like the perfect form factor for a device where, if you want to have an AI that you let see what you see, hear what you hear, it can talk to you.
01:05:32.960 You can talk to it throughout the day.
01:05:34.520 I think glasses is, that's like, yeah, if you want to have something that has the same context of the world that you do, that's, it's going to be glasses.
01:05:42.160 And there's like, there's like a billion or 2 billion people in the world who wear glasses already.
01:05:46.920 So to me, the, you know, the chance that we look back like a decade from now and like all those glasses aren't AI glasses by that period, it's kind of like, like obviously all the flip phones were going to become smartphones, right?
01:06:03.200 I mean, that, like that was clearly a thing that was going to happen.
01:06:05.840 I think that's going to happen with glasses too.
01:06:09.820 But the other piece of this is that you're going to get the ability to kind of put holograms in the world, right?
01:06:16.540 So our experience with technology today is, I don't know, it's kind of funny in a way how it's divided where it's like, you, you know, we have the physical world all around us.
01:06:27.060 And then if you want to interact with something digital, you need to like put a screen up, right?
01:06:30.240 So maybe it's like this, you know, you have your, your small glowing rectangle, your phone with you, you know, you, you have like, you know, your screen, if you want to like project something.
01:06:38.660 Yeah, your computer.
01:06:39.920 But I think in the not too distant future, this should be blended together, right?
01:06:46.160 You'll have like the physical world, but all this digital stuff should just basically be holograms.
01:06:51.020 You shouldn't need like a physical screen.
01:06:53.360 Like there's no reason why in the future, you know, you want to have a screen there.
01:06:57.780 You'll just have glasses and that screen will be a hologram.
01:07:00.720 Right.
01:07:00.900 And that's what you'd see.
01:07:01.860 That's what you put me in.
01:07:02.940 Yeah.
01:07:03.240 Somebody put me in a demo.
01:07:04.320 Yeah.
01:07:04.880 And that's the final season of Stranger Things back there.
01:07:07.380 I got put in whatever they put on me.
01:07:10.980 It was like, it makes LensCrafters look pretty, you know, lame.
01:07:17.120 Well, it's, um, it's different.
01:07:19.340 It's, it was, well, it was crazy.
01:07:20.740 I'll say it was like, can I say what happened on it?
01:07:23.200 Yeah.
01:07:23.400 Yeah.
01:07:23.700 Yeah.
01:07:23.920 It was, so they put it, you put the glasses on and they're, they're, it's like an advancement
01:07:30.320 of the pair that you have on now.
01:07:31.580 Right.
01:07:31.880 So it's like years down the line.
01:07:33.860 How many years down the line do you realistically think that those could come? Um, I'm hoping that
01:07:38.560 we'll have a version of that as a product in a few years, whether it's going to be four years or
01:07:43.440 eight years, hopefully closer to four, um, or even less, but, um, but it's, I think that
01:07:52.480 there will still be simpler glasses like this and then there will be more complex glasses.
01:07:56.760 Those will be more expensive.
01:07:57.760 There's more technology.
01:07:58.220 What do you want kind of?
01:07:59.340 Yeah.
01:07:59.600 But so some people will want like more tech.
01:08:01.800 They'll want the holograms.
01:08:02.840 Some people just want a simple experience where it's like, all right, I got the AI.
01:08:06.080 I got the ability to listen to music and phone calls and do all that.
01:08:09.660 Um, and then obviously the less tech that you put into them, the thinner they can be,
01:08:14.900 which I mean, some people like bulky glasses.
01:08:17.080 Some people want thin glasses and, uh, yeah, it was fascinating.
01:08:20.760 Yeah.
01:08:21.160 Yeah.
01:08:21.540 They had a thing on my wrist, right?
01:08:23.680 Uh-huh.
01:08:24.120 Uh, like neural interface, a neural interface.
01:08:26.620 Yeah.
01:08:26.940 I mean, you can basically control the glasses with your mind through signals that you're
01:08:32.220 sending from your brain to your hand.
01:08:34.940 Yes.
01:08:35.200 It was fascinating.
01:08:36.120 It was like, I could like, um, touch different nodules that I needed to and stuff or different
01:08:40.740 nodes, whatever it's called.
01:08:41.900 I could look at certain things and that would highlight what it was.
01:08:44.700 And it was like a, it was just a screen in midair and I could walk around the side of
01:08:49.440 the room and then come back and the screen would still be there, but it wasn't really
01:08:52.660 there in real life.
01:08:53.740 It was just there with the glasses on.
01:08:57.340 Yeah.
01:08:57.740 It's a hologram.
01:08:58.440 Yes.
01:08:58.880 And it was, it was crazy, man.
01:09:01.760 I don't know.
01:09:04.120 And then I was like, well, my first thought was like, well, how do you just get people
01:09:06.980 to adapt to this?
01:09:07.920 Cause people aren't just going to go from where we are right now to adapt.
01:09:11.680 And then I realized, okay, there's different stair steps.
01:09:14.180 There's like almost like when you got the first smartphone, like you're saying, and then
01:09:17.320 in advance or the first mobile phone, and then you have the first metaglasses and then
01:09:21.260 it advances and stuff like that.
01:09:22.660 So it was fascinating.
01:09:24.300 There was a part where me and another man who I just met him and we played ping pong
01:09:30.340 and I believe it was an Asian guy.
01:09:31.740 And I don't know if they did that on purpose or not.
01:09:32.940 And I don't know if ping pong is Asian, but it's people think it is.
01:09:36.620 And I started playing with this man back there, you know, and the table, huh?
01:09:41.200 Did you win?
01:09:41.840 Dude, who knows?
01:09:42.800 We're in the future.
01:09:43.240 I don't know if they keep score.
01:09:44.180 I think everybody gets a medal.
01:09:45.200 No, you definitely keep score.
01:09:46.360 Oh, you do?
01:09:47.040 Yeah.
01:09:47.220 And then in our future, not everyone gets a medal.
01:09:49.400 Yeah.
01:09:49.920 I like that.
01:09:50.480 No, it's-
01:09:51.380 What, are you not entertained?
01:09:53.160 Yeah.
01:09:53.460 No, that's more, yeah.
01:09:55.100 Oreo, fire up the grill.
01:09:57.160 Yeah.
01:09:59.320 Loser burns.
01:10:00.880 But no, um, the, uh, but we could play a game, and it was a ping
01:10:07.480 pong table in front of us that was not real.
01:10:09.260 It was like 3D ping pong.
01:10:10.940 Yes.
01:10:11.200 Yeah.
01:10:11.400 Yeah.
01:10:11.580 Yeah.
01:10:11.680 So it wasn't there.
01:10:12.600 It wasn't there.
01:10:13.800 Uh-huh.
01:10:14.220 Someone could ride their bike through it.
01:10:15.700 Like, your dumb brother could ride his freaking bike.
01:10:18.420 Like, dang it.
01:10:19.400 Yeah.
01:10:19.820 Ricky, bro.
01:10:20.920 You freaking rode right through our net.
01:10:22.460 So, I mean, I think it's an interesting thought experiment how many of the things that we
01:10:27.500 physically have aren't going to need to be there in the future.
01:10:31.320 Right?
01:10:31.520 So pretty much every screen doesn't need to be there, right?
01:10:34.500 It'll just be a hologram.
01:10:36.120 Um, any media, any book that you're playing, any board game, any cards.
01:10:40.640 Those are, yeah.
01:10:41.580 That'd be nice.
01:10:41.960 Because one thing I hate is at the airport, all these TVs are on in there.
01:10:45.180 It's like, everything is so loud now.
01:10:47.120 It's like, can you just make it for you?
01:10:49.540 Just turn it down.
01:10:50.040 Why do we all have to experience this painful noise sometimes?
01:10:55.840 Yeah.
01:10:55.940 It's like, you're driving through a city and there's billboards and nothing is personalized.
01:11:00.740 It's like how TV used to be, right?
01:11:02.640 I mean, in the future, you know, now, like all the stuff that you use on your phone, it's
01:11:06.180 like, you get exactly what you're interested in and it's just a much more, much higher quality
01:11:10.400 experience.
01:11:11.240 But there's all this physical stuff that's just stuck, right?
01:11:14.000 And is static.
01:11:14.820 And, um, yeah, I mean, everything I think is going to be able to, um, sorry, not, not
01:11:22.160 everything, but, but it's, I think it's an interesting thought experiment, how much of
01:11:24.660 the stuff that we physically have today that just doesn't actually need to exist in the
01:11:28.940 future.
01:11:29.220 Um, so do you think, um, is social media bad?
01:11:36.720 I mean, I don't think so.
01:11:39.640 Yeah.
01:11:40.040 Maybe that's not the right term.
01:11:41.080 Um, like at a certain point does, yeah.
01:11:47.380 Like how much, you know, there's been like studies on where it's like doom scrolling and
01:11:51.260 stuff like that can lead to depression, that sort of thing.
01:11:53.600 Like, do you, yeah, what do you think about that?
01:11:56.720 Like, yeah.
01:11:57.560 So, so look, I mean, we obviously study this stuff pretty carefully.
01:12:01.160 Um, you guys do.
01:12:02.140 Yeah.
01:12:02.400 I mean, we study it, we work with academics to study it.
01:12:05.260 Um, you know, as you can imagine, there's a lot of like media coverage of this stuff.
01:12:10.020 That's like very, um, sensationalist that tries to like have a skewed point of view.
01:12:16.060 Right.
01:12:16.420 So my understanding of the current state of the research is that there isn't kind of a
01:12:21.160 conclusive, um, finding that this is negative for people's wellbeing.
01:12:26.200 So I think that that's, and in general, you know, some of the stuff that ends up being
01:12:30.580 positive for people is, is building relationships.
01:12:33.320 So there's sort of the media part of social media and there's the social part of social media.
01:12:37.720 I think the, the interacting with people, um, to the extent that that's helping you build
01:12:43.460 good relationships, I mean, friendships and good relationships is one of the things that
01:12:49.460 correlates the most strongly with positive wellbeing and like feeling good about your
01:12:54.080 life and all that.
01:12:55.080 The media stuff.
01:12:56.220 I mean, I think that's more entertainment.
01:12:57.560 You know, I think you can, you know, people want things that are fun, right?
01:13:01.500 People want to be entertained for sure.
01:13:03.040 It doesn't necessarily like correlate with good wellbeing or bad wellbeing.
01:13:06.420 But I guess like the way that I think about this stuff is that it's, um, our modern online
01:13:15.080 environment is, um, it, it's just an environment in which we live in a way that has pros and
01:13:25.240 cons, like whether you live in a city or a rural place, right?
01:13:28.860 It's like, okay, like some people may prefer living in a city.
01:13:32.180 There's like things that are good about a city.
01:13:33.760 There are things that are bad about a city.
01:13:35.340 Okay.
01:13:35.700 Like the fact that we went from being primarily offline as a society to now like this kind
01:13:41.780 of hybrid kind of physical and digital reality, there's some things that are better about
01:13:46.400 that.
01:13:46.720 Some things that are, that are maybe less good.
01:13:48.840 Um, it's not that every single thing improves at each step along the way, but I think overall,
01:13:53.360 um, it's, it's, um, kind of the overall effect is significant improvement.
01:13:59.100 And I, I just don't see a way that people would want to like go back to not having services
01:14:04.620 where they can get the content that they're interested in and where stuff is personalized
01:14:09.140 to them and where they can communicate with the people who they want and don't need to
01:14:12.580 like be constrained to just the people who are physically around them.
01:14:15.320 It's just, I mean, like we're, we're just, you know, we're not going back to that.
01:14:18.580 And if you look throughout history, it's like when people like first built cities, you know,
01:14:24.540 there were all these, um, there's a lot of kind of nostalgia for the, the simpler life
01:14:30.260 and things like that.
01:14:31.160 And it's like, okay, yeah.
01:14:32.180 And, and yeah, sure.
01:14:34.560 Maybe, maybe cities aren't better in every single way.
01:14:36.680 And some people might prefer to kind of only go sometimes or whatever, but, but I, I just
01:14:41.300 think it's like, we're, we're building more and more capabilities as a society.
01:14:45.000 And I think that that's sort of, I just think about this stuff more as like an environment
01:14:48.360 in which we live.
01:14:49.800 So then it's more like if, yeah, like sometimes I think like, cause I've been thinking a lot
01:14:53.740 about this.
01:14:54.160 I think a lot of people have, it's like, well, we're moving into such a, like technology is advancing
01:14:58.520 so much faster, right?
01:14:59.820 It's accelerating the advancement speed.
01:15:02.540 And then we're just humans and we're in this, this space now where there's like one generation
01:15:07.520 to the next, where one generation was completely grown up online and one hasn't really.
01:15:11.720 Um, so I wonder, and I think the older generation probably sees it as like, man, this is so negative.
01:15:18.500 Everybody's stuck.
01:15:19.200 But I wonder if the younger generation, do they even know any different?
01:15:22.200 Right.
01:15:22.440 You know, like just, I start to wonder what is the value of being human?
01:15:27.100 Like, like what is the, does that start to dissipate as we become more technologically
01:15:32.680 advanced or does that alter?
01:15:35.400 Like, do you think about that sometimes?
01:15:37.980 Yeah.
01:15:38.420 I mean, look, I think it's going to give people the freedom to focus more on the things that
01:15:42.900 they want.
01:15:43.360 I think that if you look at the arc of, of history, go back, like, I mean, before, you
01:15:49.020 know, in ancient times, life was like pretty brutal, right?
01:15:51.320 It's like, okay.
01:15:52.620 So then, then you go to like, maybe right before the industrial revolution where they
01:15:57.280 had team toilet paper at one time, they had like group toilet paper and it was like, yeah,
01:16:01.080 that's, I mean, that's not good.
01:16:02.660 That's not good.
01:16:03.440 Um, yeah, I don't think, I mean, I don't, you don't want to, yeah, no, it's, I think anyone
01:16:08.220 who has nostalgia for the past really is not taking into account the disease and the lack
01:16:13.160 of hygiene that existed in the past.
01:16:14.840 So I think that there's, so anyway, but pre-industrial revolution, I think some huge percent of the
01:16:24.160 population was farmers because basically everyone needed to be focused on growing things in order
01:16:32.480 to have enough food.
01:16:33.360 Then we basically got to the point where, okay, now we can start to produce food much more
01:16:39.160 efficiently. So that actually frees up a lot of people to not have to be farmers. Now, maybe like
01:16:45.780 2% of people can be farmers and 98% of people can do other stuff. So now we, people start doing more
01:16:52.320 kind of other creative stuff and inventing new things. And, um, and at each step along the way,
01:16:57.520 uh, kind of like being a farmer is a really hard job, right? It's like, you're working like a lot of
01:17:03.500 hours. And, and, and, and, and now as, as kind of people got more options, they, they took other
01:17:10.360 jobs. Um, but then we've also seen this mix where the percent of people's time that has gone towards
01:17:16.860 leisure and entertainment has just steadily increased over time. Um, and I just think that
01:17:22.920 that's going to continue to be the case with technology. We'll have more stuff that will make
01:17:28.140 it so that the basic needs, um, are taken care of, which will free people up to do some combination
01:17:34.100 of more creative jobs and not have to work as hard if they don't want to. But I think some people are
01:17:41.860 going to like working all the time like me and, um, and you'll be able to do that too and get more
01:17:47.740 done than you could have ever possibly done in the past. Uh, but that'll be sort of a choice. And
01:17:52.340 well, since you're kind of like a leader in innovation and technology in our world, you know,
01:17:58.260 um, do you, how do you know that what your convictions are? How do you gauge if what your
01:18:05.260 convictions are, are the best for everybody? Kind of like, how do you kind of figure that out? You
01:18:09.880 know, it seems like such a challenge. Yeah. Well, does that question make sense or no?
01:18:14.540 Yeah, no, I, I think I get what you're asking. Um, I mean, look at the end of the day,
01:18:19.340 um, there are still a lot of options of things that people can do just because I build something
01:18:25.480 doesn't mean that people are going to use it. Actually, a lot of the things that I build,
01:18:28.580 like some, some of them work, some of them don't. And like, I think part of the reason why
01:18:33.500 the company has been successful is because, you know, maybe we have a slightly higher hit rate
01:18:39.640 of things working than others, but it's kind of like, I don't know, in baseball, it's like most people
01:18:45.180 don't get on base most of the time. Right. It's so it's, it's like, like running one of these
01:18:50.460 companies, you, you more of the stuff doesn't work than does. And if we do something that doesn't
01:18:58.920 work, then in general, people aren't going to use it. And then the future doesn't go in that
01:19:04.220 direction. So, so you're saying it's up to the user more. Yeah. I mean, look, I, I kind of
01:19:09.140 think, um, one way to look at the world is that there's a version of history that says that like
01:19:20.960 individual people are very powerful and have a lot of kind of autonomy and ability to, to kind of go
01:19:26.580 in the direction that they think is right. And then there's like all these other narratives where
01:19:30.360 people try to kind of diminish people's autonomy and authority. And I'm just like, I've always been
01:19:37.240 a person who really kind of believes that people understand people are smarter than people think.
01:19:44.460 Um, yes. And, and I think in general, um, are able to make good decisions for their lives.
01:19:51.260 And when they do things that like the media or whatever thinks don't make sense, it's generally
01:19:55.940 because the media doesn't understand their life, not because the people are stupid. Um, like if people
01:20:00.360 are saying something that seems wrong, it's not usually misinformation. It's usually that you don't
01:20:04.520 understand what's going on in that person's life. And I just think that there's like a certain
01:20:09.620 kind of paternalism in, in some of the like mainstream narratives and some of the media
01:20:15.300 narratives, but like a know-it-all ism almost. Yeah. It's like, there has been for sure for years.
01:20:20.320 I think it's starting to change more. Yeah. I think it's a little more receptive as,
01:20:24.240 as maybe some of those cultural or media elite people like are having a harder time predicting what's
01:20:30.140 going to happen in the world. Maybe there's a little more humility of like, okay, maybe we don't
01:20:33.600 understand all of this, but, but to me, the best predicting thing has always been like,
01:20:40.440 all right, if you build something, do people actually think it's good? Because like at some
01:20:45.340 level, you know, it's like, I just believe that people are actually very smart and understand
01:20:48.600 their lives very well. And if you're building something that is useful for them, then they
01:20:52.340 will use it. And if you're using something, if you're building something that is not useful
01:20:55.880 for them, then they have other options. They will, they will do something else. Um, and so I don't
01:21:02.600 know, it's always served me well to generally have faith in people and believe that people
01:21:09.560 are smart and can make good decisions for themselves. And whenever we try to like adopt
01:21:14.820 some sort of like attitude of, Oh, we must know better than them. It's like, we're like,
01:21:19.880 we're the people building technology. That's when you lose. Right. And if you do, and if you
01:21:23.740 have that attitude for long enough, then you just like become a shitty company and you lose
01:21:28.720 and you lose and you lose and then you're irrelevant. So, um, so I, I, I tend to just
01:21:34.680 think that at the end of the day, yeah, I mean, I think people are smarter than, than a lot
01:21:38.780 of people think. And I think they ultimately drive the direction that society goes in.
01:21:42.180 You, um, so like people, a lot of times, like there's guys who are like kind of, you know,
01:21:50.400 Elon Musk is probably like a socially awkward guy. And I would say that, I mean, I think
01:21:54.060 it's, yeah, we all are right. We all are right. I think we all are. And it's interesting that
01:22:00.560 there's like probably people have, I mean, have you ever felt socially awkward over your
01:22:04.240 years? No, I'm, I'm, I'm really smooth. Obviously. Yes. Yeah. I'm like the most awkward person.
01:22:11.680 People have been calling me a robot online for 20 years. It does wonders for my confidence.
01:22:18.400 No, your confidence cannot be impaired, I don't think. That's one thing you have. That's probably
01:22:22.480 a sheer North Star inside of you. It's gotta be, you might've become bulletproof. I think there's
01:22:28.460 times where, yeah, you seem like a guy who probably like, like watched a video of how to be a guy on
01:22:34.020 YouTube or something, you know, but I think we all, we all go through, like, we're all like
01:22:38.720 awkward in different ways. You know, you put you in certain environments and you're not at all.
01:22:42.400 I, but I think it's interesting that there's, I haven't found those environments yet, but maybe,
01:22:46.380 you know, it's, uh, even being here today, bro, is nice of you. It's nice of you to be here today.
01:22:52.140 No, I, I think the podcasts are awesome. Cause you're just like, you get to explore something.
01:22:56.180 Right. For sure. But here's my question. I'm sorry, Mark. I didn't mean to interrupt you,
01:22:59.280 but I'm not going to get another chance to. So my question is, is it interesting
01:23:04.000 that they have kind of people who have would probably self-describe as socially awkward
01:23:08.680 at times, kind of creating technology that socially connects people? That's the thing
01:23:15.580 that you ever kind of find that kind of fascinating. Cause I've always had a belief that, that, that
01:23:21.820 like sometimes socially awkward people are almost a mix between human and like machine, like,
01:23:28.060 like the future or something that make any sense. Um, yeah, you know, I, it's an interesting
01:23:35.760 question. I, I just think that there are a bunch of factors here that you need to peel apart. Um,
01:23:43.740 I think someone can be socially perceptive and understand kind of what is going on in social
01:23:52.880 dynamics and have a lot of empathy and care about other people while still being quite awkward in
01:24:00.080 how they communicate. And, um, and so I, I don't think you can build a great, yeah, I'm trying to
01:24:08.900 build too broad of a bridge. No, well, I think it's a fair point, right? I mean, it's like, all right,
01:24:14.700 a lot of social media is like people creating great content and, um, and kind of communicating
01:24:22.320 really well. And those are not my biggest strengths, right? It's like, I'm, I'm, I don't
01:24:26.560 think I'm the best communicator by a long shot. I mean, I think I kind of got to where I am because
01:24:30.900 I, I think I kind of understand what people like and I have the ability to build it, but I don't think
01:24:38.520 my strength is like, Oh, I can really like communicate about why what I'm building is awesome. I generally
01:24:44.560 like to make it so that my work speaks for itself. And I try to explain it so I can kind of explain
01:24:50.700 how I'm thinking about it, but, but I don't think people primarily like using our stuff because,
01:24:56.100 you know, they saw me talk about it and they're like, Oh yeah, this seems super exciting now.
01:25:00.720 Um, but, but I think the ability to kind of communicate in a way that is not awkward is a
01:25:07.180 different skill, um, than the ability to kind of understand and have empathy for, for kind of
01:25:14.380 people and, and, and social interactions. And it's, you know, there's an interesting
01:25:19.380 thing where I actually think sometimes a lot of the people who can communicate in the smoothest way
01:25:26.100 sometimes have a lot less empathy and understanding of social dynamics than the
01:25:31.920 kind of nerdy guy who may not be able to express himself quite as well, but sort of understands a
01:25:39.540 little bit better what's going on. Um, and I don't know, the world's complicated and there's like
01:25:45.900 multiple dimensions to all this stuff and no one's good at all of them. So you just, you know,
01:25:51.000 try to do the things that you're like, hone the things that you're good at and try to put it to
01:25:55.160 service to, you know, do as good of work as you can.
01:25:58.020 Yeah. I think we're in this unique place where I believe it's like one or two generations think
01:26:02.640 that something isn't social, and then the younger generation thinks that it is social.
01:26:07.840 Yeah. And so I feel like in some senses, we're at this crossroads kind of of like how we
01:26:13.160 communicate as humans a little bit. Um, and this advancement and sometimes those steps are, uh,
01:26:18.600 are kind of tricky to take. Um, this is my last question. Cause I know you have to go. Um,
01:26:23.140 so I feel like Elon Musk has like a, like he wants to get on Mars and he wants to like impregnate
01:26:28.000 planets or whatever, you know, or whatever he's doing, dude, he's just blasting his seed out into
01:26:32.800 the different rockets, whatever, you know, he's just out there, you know, he's like the Johnny
01:26:36.740 Apple solar system, you know, but God bless, but no, God bless a hundred percent dude. And yeah,
01:26:42.860 a hundred percent. Um, and I'm just joking. He, uh, I think he probably would know that, but,
01:26:48.460 um, I just think it's interesting. Like you get his, you get what his ideas are. He wants to like,
01:26:53.900 we want to be on Mars and we want to send the rockets and we want to make everything solar
01:26:57.540 powered and stuff. And you're such an innovator and a leader. Like what is, what do you feel
01:27:02.000 like is your kind of, it's not a goal, but kind of where do you, what do you, what draws you,
01:27:07.640 like, what is the thing where you see us, you know, like where you're at a point where like,
01:27:11.200 man, this is what I'm really proud of. And this is where I see us going because you know,
01:27:16.700 we're all on this bus together going somewhere and we don't exactly know where we're going
01:27:20.760 because it's the future, but you're kind of, you're kind of one of the guys driving the bus,
01:27:26.240 you know, or at least riding shotgun. So it's like, where are we going?
01:27:30.740 I mean, I think different people just care about, about, uh, different parts of, of, uh,
01:27:36.540 the future. Right. So the space thing, I think it's cool. I'm glad that people are working on it.
01:27:40.480 It's never really been my, my main thing. Um, for me, it's all, it's, it's kind of been about
01:27:46.460 the intersection of, of how do you build technology that helps people connect with each other and
01:27:53.460 understand the world better and, um, and just taking different formats over time. Right. So
01:27:59.280 when we got started, people, you know, are mostly like, you know, writing text. Then, you know,
01:28:04.740 then we got smartphones and, um, we got cameras with the smartphones. We started taking a lot of photos
01:28:10.700 and sharing that. And now, now most of what we do is video, right? The mobile networks are,
01:28:15.820 are good enough that you can like share great video. It didn't use to be. Yeah. Like 10 years
01:28:19.800 ago, you'd try watching a video. It's like buffering, buffering. It's like, okay, this is, this is
01:28:23.660 terrible, but now, now it's good. Um, people always want to both kind of express their ideas and
01:28:33.260 experience other people's ideas in whatever the richest format is that they can. So if that was
01:28:38.360 going to be text, then that was the best that they had. Then photos: visual, great. A picture's
01:28:43.620 worth a thousand words. Video: better than photo for most things. But I don't think that's the end of
01:28:48.320 the line. Right. I just think, um, we're going to get there, whether it's five years or 10 years. And
01:28:54.820 I think the ability to like fully capture moments, um, to really be able to experience them. I think
01:29:01.740 that's sort of the hologram thing that we're talking about. Um, I think that that's just going to be
01:29:06.580 like the next level of, um, of people being able to express like ideas and moments in their life.
01:29:15.580 Yeah. Yeah. Yeah. Like the next like blank canvas. So that's one thing. And then there's the whole AI
01:29:19.480 thing, which we haven't spent as much time on, but we should, we should. Um, that is really going
01:29:27.620 to, um, give people a lot of new tools to, to kind of, um, both just get smarter at everything they do.
01:29:36.140 And, um, and if you look at the world, like I do through this lens of how do we express ourselves
01:29:42.280 and how do we kind of take in an understanding of what's going on? AI is just going to be super
01:29:48.140 powerful for, for both of those. It's, um, I mean, you can already see some of the basic stuff with
01:29:53.400 like people creating images or editing images. I mean, it's crazy how fast it's happening too.
01:29:57.820 That's what I'm amazed at is just how fast it's happening. Yeah. Yeah. But do you think like,
01:30:01.280 do you envision this? Like, are you just like, like, what do you envision that will be on these
01:30:07.400 like surfaces? Do you have like a, like kind of like a utopian idea? Or, like, I just wonder how do you
01:30:13.680 see things? Cause you're the only, like, you're probably one of the smartest people to ever use
01:30:19.000 thought. I don't know. I don't know. Well, you're, you know, one of the most unique people
01:30:24.280 to ever use thought that we have in our time. So it's like, how do you, what is it? What is it
01:30:30.260 like out there? I mean, I think that we're going to get general intelligence. Um, we're going to have
01:30:37.760 systems that are smarter than, um, than any individual. And I think it's mostly going to be
01:30:42.860 very empowering for people. First of all, look, I mean, there are already systems that are smarter
01:30:47.540 than any one individual today. Right. If you take like a company, right. It's like, okay,
01:30:51.000 you got like a thousand people or 10,000 people who are all kind of like working towards, you know,
01:30:55.740 ostensibly working towards a goal together. Like I, you know, if the intelligence of a 10,000 person
01:31:02.240 company is not greater than the intelligence of a single person, then like, what are we doing here?
01:31:06.320 Right. So there are already these systems in the world that have this sort of super intelligence that
01:31:12.760 far exceeds what any one person can do. And I just think like, instead of having relatively few
01:31:20.460 people be able to kind of harness that power, like, you know, the ability of
01:31:25.700 a 10,000 person organization that can, um, that can help you build the things that you think are good,
01:31:30.980 I just think in the future, almost everyone is going to have that. And that's cool. What does that
01:31:35.000 mean? It means that more ideas are going to get tried out. Um, so it might be like leveling up an
01:31:39.640 overall idea of creativity in the universe with AI. Yeah. And, and I think it's going to be every
01:31:44.440 field, right? So like science will get more advanced, um, like we'll get more productive,
01:31:49.740 but like, I think a big part of the internet is stuff just gets more fun and funnier and like the
01:31:55.820 memes get weirder and more specific. And like that is advancement too. It's almost like universal
01:32:01.540 basic technology kind of in a way. Yeah. It's just the, the ability to, to kind of express these like
01:32:07.300 very complicated ideas in like a very simple piece of media. I think we're going to get better
01:32:13.660 and better at that. And that advances our kind of understanding of ourselves as a society. So,
01:32:18.480 um, yeah, I don't know. I think we'll get super intelligence and I would guess that it will be,
01:32:24.200 um, a continuation of this trend that humanity has been on for a hundred plus years of basically
01:32:31.760 getting more time to do creative things, less time having to do drudgery,
01:32:36.000 not having to spend as much time working if you don't want, but if you want to, uh, dedicate your
01:32:41.340 life towards that, you're going to have more powerful tools than you could ever have possibly
01:32:45.240 imagined. So it's not so much a conviction as it is a space of choice, is how you see
01:32:50.040 that? Like just kind of that sort of thing. What do you mean? It's not like you're, like,
01:32:53.000 convicted to this sort of, like, future as we advance. Like it's just a space of choice.
01:32:57.860 If you still want to be able to do these things, you can do those. And if you want to be able to do
01:33:00.660 these things, you can do those. Yeah. There you are on the bus right there. There you go. I mean,
01:33:04.580 that's a little weird of a photo, but you know, I think that is good. I do not think that is good.
01:33:09.340 Um, I don't like that at all. And we, thank you for telling us that though. It's horrible AI.
01:33:16.000 Yeah. I think that was OpenAI, dude. So sorry about that. Um, do you think you,
01:33:20.000 do you think you will live forever, Mark? Is that a thought coming to your head?
01:33:23.220 Yeah. Um, that is an interesting question. I think you have two minutes to answer, they informed me.
01:33:29.680 Yeah. No, I, I don't know. Live forever. Gosh, I think at some point we will like a lot of,
01:33:38.100 well, let me come at it this way. Um, outside of Meta, the philanthropy that I do with,
01:33:46.440 with Priscilla, the Chan Zuckerberg Initiative, is primarily focused on curing diseases. And the way
01:33:52.420 we're doing that is not by focusing on any single disease. It's by focusing on kind of basic
01:33:56.760 technology at the intersection of AI and biology to accelerate the pace of science. And we originally
01:34:03.420 thought that kind of a hundred years from when we started was sort of around the timeframe to be
01:34:11.540 able to cure and prevent and like deal with all diseases. I know there's a chance that that happens
01:34:18.540 sooner than that, um, you know, because of, because of all the work with, with AI. I guess I'm more
01:34:22.920 optimistic about that now. Um, whether that means that like we are going to live forever or we just
01:34:29.020 have healthier lives for the period that we're supposed to be living. And then at some point,
01:34:33.500 like your human body is done. Um, I don't, I don't think we understand that yet enough, but, um,
01:34:40.560 but a lot of curing diseases is not just about living forever. It's about having better lives.
01:34:48.060 And, and it's like, and it's like, while you're alive, like you don't want to have to
01:34:51.940 deal with shit. Right. It's like, like where you're just like, you feel terrible because you're sick or
01:34:57.760 like you're injured or there's, you know, so I think it's possible. Like you, when you, because you have
01:35:02.360 an understanding of like science and being human that, that way supersedes a lot of people's.
01:35:08.060 Do you, um, do you think it's possible that we could figure that out?
01:35:13.060 Curing all diseases? Uh, to, like, live, to keep life going. Like,
01:35:17.000 do you think it's possible that we could figure out how to do it?
01:35:19.220 I think it's possible. Um, I don't know. It's honestly, it's not an area that I've
01:35:22.720 studied that much and it's, um, it's... No, cause we don't know, we don't know what's going on,
01:35:26.660 Mark. And we just want you to tell us. Yeah. I mean, no, I don't know. I don't know. Let's,
01:35:31.620 let's check back in, in 10 years on that. I think there, as, as the, the AI stuff makes more
01:35:36.520 progress, I think we'll, we'll kind of get a sense of the trajectory for that. But,
01:35:40.700 but I think it's just going to unlock a lot of creativity and productivity and fun. And like,
01:35:46.120 I think people, the technology industry misses fun a lot. I think that that's like one observation
01:35:51.300 that I've had building stuff out here is, um, people are very focused on like, all right,
01:35:56.560 we're going to make like a better word processor. We're going to like process information better.
01:36:00.960 Cool. Um, but I know a lot of what people care about is just like, all right, I want to like
01:36:07.640 be entertained. I want like, I want my life to be fun. Right. I want, I want something that's funny
01:36:12.760 that I can then go show to my friends and then we can talk about that and like, and then just hang
01:36:18.200 out and have a good time, like showing each other funny things and like talking about what the world
01:36:22.200 is. And I think AI is going to make this stuff all great. So, um, I don't know. I mean, it's,
01:36:29.160 I do think this is, this is a big, a big focus for us and we're building this meta AI. Um, it's our,
01:36:35.640 we call it personal AI. I mean, our goal on this is not just to build something that's like,
01:36:42.320 I mean, yeah, it's going to be super smart, right? It's, it's like, yeah, we're trying to solve full
01:36:46.360 general intelligence and super intelligence and all that. But I think in order to build the product
01:36:50.700 that people are going to want to use, um, you're going to want to build something that's fun to
01:36:55.180 use. And that means like, you're not just going to want to like type to it. You want to like have
01:36:58.780 a conversation. Right. And it, and it's not just about having it be like only able to answer hard
01:37:05.240 questions. It should like get to know you and like what you think is funny and like what you ate.
01:37:09.780 Right. So that way, you know, it can like, or, you know, what your hobbies are. So that way it can
01:37:13.440 kind of relate to you. And, um, I think people don't know that AI can do that. I think that we need,
01:37:17.780 and we're missing, quickly, an education: educating people on what AI is and how
01:37:24.000 to use it. I think I noticed even in my own life, and I spend a lot of time online involved in stuff,
01:37:29.340 but I think people are not understanding what's going on. So I don't know how we get people
01:37:33.580 educated quickly so that they can, I think whoever can also serve people the best way to educate
01:37:38.220 themselves is going to be able to best exert, um, or, um, be able to best, uh, coagulate people to
01:37:47.220 their AI, um, model or whatever. I've never heard coagulate used that way. Or yeah, or something
01:37:51.900 like that. Yeah. But because that's the big thing, a lot of people are like, we have the best AI,
01:37:56.340 but most people are like, what the hell is, what are we doing? You know what I'm saying?
01:37:59.240 Yeah. So anyway, sorry. I think I kept you longer than we were supposed to, but, um,
01:38:04.580 yeah, sorry. We got into a lot of technological stuff. I think it's just kind of like, you know,
01:38:08.820 we don't know what's happening. And sometimes we want to talk, we get to talk to somebody like you
01:38:11.820 and it's important, you know? Yeah, no, it's good. This is a fun conversation.
01:38:14.960 Yeah, man. I enjoyed it too, man. Um, yeah. Thanks for sharing just some of what your life
01:38:19.920 is like with us. And, um, just, I hope that, yeah, just keep, make sure we stay alive or
01:38:25.660 whatever we're supposed to and just keep working on it. Okay. That's all we can ask for
01:38:30.480 now, sir. Awesome. Mark, thanks to you, man. Appreciate it, brother. Yeah. Yeah.
01:38:34.160 Now I'm just floating on the breeze and I feel I'm falling like these leaves. I must be
01:38:41.200 cornerstone. Oh, but when I reach that ground, I'll share this peace of mind. I found I can feel it
01:38:52.180 I also want to say a thank you to the people at Grace Dental. That's Grace Dental in Palo Alto,
01:39:07.620 California. They, um, my tooth was broken and they glued it back together, uh, before the
01:39:13.580 interview, uh, today. So I'm thankful that they helped and I'm grateful that I got to meet them.
01:39:19.880 Thanks.