The Joe Rogan Experience - July 02, 2025


Joe Rogan Experience #2344 - Amjad Masad


Episode Stats

Length: 2 hours and 51 minutes
Words per Minute: 159.57
Word Count: 27,414
Sentence Count: 2,463
Misogynist Sentences: 11


Summary

In this episode of the Joe Rogan Experience, we talk about video games and their impact on our brains, and whether they should be required cross-training in some professions. We also talk about whether video games should be taught to surgeons.


Transcript

00:00:01.000 Joe Rogan podcast, check it out!
00:00:03.000 The Joe Rogan experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day!
00:00:13.000 What's up, man?
00:00:14.000 Good.
00:00:15.000 So having this big Counter-Strike tournament in town, does that give you the Joneses?
00:00:19.000 Totally, totally.
00:00:20.000 You know, it's like, so your guy, Jason, was telling me about it because, you know, in addition to driving, he also flies the helicopter.
00:00:31.000 And he told me, like, the Red Bull guys were flying off, and there's like this big tournament.
00:00:35.000 I looked it up.
00:00:36.000 It was like, oh, Counter-Strike.
00:00:37.000 So I used to be a bit of a pro player myself.
00:00:40.000 So how do you get out of pro playing?
00:00:42.000 Because the problem with playing games is that it's essentially like an eight-hour a day thing.
00:00:48.000 Like it becomes a giant chunk of your life, right?
00:00:50.000 And I would imagine if you're playing pro, it's even more of a commitment.
00:00:54.000 You know, I take a different view on games.
00:00:58.000 You know, a lot of people kind of view it as a sort of somehow like a negative thing, especially for kids.
00:01:04.000 Actually, I got my kid, my four-year-old, like a Nintendo Switch early on.
00:01:09.000 We're playing together because I feel like for me, it helped me a lot with strategy thinking, with reaction time.
00:01:17.000 I think gamers tend to think really fast.
00:01:20.000 Have you seen the studies that they've done about surgeons?
00:01:23.000 No, tell me.
00:01:24.000 Surgeons that play video games regularly are much less likely to make mistakes.
00:01:29.000 I totally believe it.
00:01:31.000 Something in the neighborhood of 25%.
00:01:33.000 Is that what it is, Jamie?
00:01:34.000 Something like that?
00:01:35.000 But so much so that I would say you should teach video games to surgeons.
00:01:40.000 It should actually be a required thing, like cross-training.
00:01:44.000 Right.
00:01:44.000 Isn't the Army also recruiting from gamers today as well?
00:01:46.000 That's what I heard.
00:01:47.000 I would imagine like drone pilots.
00:01:48.000 Right.
00:01:49.000 Right.
00:01:49.000 I mean, that would make a big difference.
00:01:51.000 Especially if you can get them used to the same controllers.
00:01:53.000 Totally.
00:01:54.000 You know, because those controllers kind of become a part of your hand.
00:01:57.000 Like, you know exactly where all the buttons are.
00:01:59.000 Right.
00:02:00.000 If you're a kid that's playing fucking Counter-Strike or whatever it is, Call of Duty every day, I would imagine that that just becomes nature.
00:02:08.000 Yeah, yeah.
00:02:10.000 What is the thing with surgeons?
00:02:11.000 It's nuts, right?
00:02:13.000 It might be higher than 25%.
00:02:15.000 It was a very particular kind of surgery, though, too, but it was like, I mean, they're almost using controllers, so fine.
00:02:21.000 Yeah, but that they were making less mistakes.
00:02:24.000 I don't think it's entirely negative.
00:02:26.000 Because I love games.
00:02:28.000 I love playing them, but I love them so much that I don't play them because I know I don't have any time.
00:02:33.000 Quake is your favorite game, right?
00:02:35.000 Yeah.
00:02:36.000 So here you go: 37% decrease in errors.
00:02:39.000 That's wild.
00:02:40.000 27% faster task completion time.
00:02:42.000 That's nuts.
00:02:44.000 So those guys grew up playing video games, or did they say more than three hours per week?
00:02:50.000 I think they were still playing when they were doing the study.
00:02:53.000 Yeah.
00:02:53.000 So like that, I mean, imagine something that, like a pill you could take that would give you a 37% decrease in errors and a 27% faster task completion.
00:03:05.000 That would be an incredible pill.
00:03:06.000 Like you would make every surgeon take it.
00:03:08.000 Did you take your video game pill before you do surgery?
00:03:12.000 Hey, man, don't operate on my fucking brain unless you take your video game pill.
00:03:15.000 You know, that's, you know, next time I need to have a surgery or whatever, I'm just going to ask the doctor.
00:03:19.000 Is there a game?
00:03:20.000 How much do you game, bro?
00:03:23.000 But Jamie and I were talking about the one thing, and maybe that's kind of showing our age a little bit, but the one thing that's kind of like a little weird slash, I don't know somehow, like a little dystopian is the whole streaming situation where like kids are not like playing the game, they're like watching someone play the game.
00:03:40.000 And it's like this zombifying thing where like they'll spend hours just watching people.
00:03:40.000 Yeah, that's not good.
00:03:46.000 Yeah, this TikToking, it's essentially like TikTok, but video games, right?
00:03:50.000 Because TikTok is kind of this mindless thing.
00:03:52.000 You're just scrolling through mindless things and now you're mindlessly watching someone else play a game.
00:03:57.000 Yeah, it's almost like someone is like there's this strange thing with technology where like someone is living life and doing things and you're like sort of it's almost voyeurism or something like that about it.
00:03:57.000 Yeah.
00:04:11.000 You know, David Foster Wallace, you know, the guy from Infinite Jest, wrote an essay on TV.
00:04:20.000 And, you know, he committed suicide before like, you know, the emergence of mobile phones and things like that.
00:04:27.000 But he was very prescient on the impact of technology on society and especially on America.
00:04:34.000 And he was also addicted to TV.
00:04:36.000 And he talked about how it activates some kind of something in us that is something in human nature about voyeurism.
00:04:46.000 And that's the thing that television and TikTok and things like that activate.
00:04:50.000 And it's like this negative, addictive kind of behavior that's really bad for society.
00:04:57.000 I definitely think there's an aspect of voyeurism, but there's just a dull drone of attention draw.
00:05:04.000 There's a dullness to it that just like sucks you in like slack jawed.
00:05:09.000 It is watching nonsense over and over and over again that does just enough to captivate your attention, but doesn't excite you, doesn't stimulate you, doesn't necessarily inspire you to do anything. That is the first fly we've ever had in this room.
00:05:23.000 Boom.
00:05:24.000 Oh, I'm going to kill it.
00:05:26.000 You're a nice person.
00:05:28.000 You won't be able to kill that fly right away.
00:05:30.000 But it's just this thing where it doesn't do a lot.
00:05:36.000 It's not like, you know, like, have you ever done Disney World?
00:05:41.000 Yeah.
00:05:41.000 Did you ever do Disney World in Florida where you do that giraffe?
00:05:44.000 There's the Avatar ride?
00:05:46.000 No, I just went to a California one.
00:05:48.000 Okay.
00:05:49.000 The Avatar ride is Flights of Freedom?
00:05:52.000 Flights of Passage.
00:05:53.000 Flights of Passage.
00:05:55.000 It's a VR game.
00:05:57.000 Well, a ride, rather.
00:06:00.000 And you put on a VR helmet and you get on this motorcycle looking thing.
00:06:03.000 You're essentially riding a dragon.
00:06:05.000 It's unbelievably engaging.
00:06:07.000 It's incredible.
00:06:08.000 It's the best ride I've ever been on in my life.
00:06:10.000 Like you're flying around, you feel the breeze, you're on this thing and the sounds are incredible.
00:06:10.000 That's cool.
00:06:15.000 That's like engrossing, right?
00:06:18.000 It takes over you.
00:06:19.000 Stimulating, but that's not what you're getting from like TikTok or like streaming.
00:06:24.000 You're getting this, oh, this dull thing, so it's sustainable.
00:06:29.000 Yeah, I wonder which is worse, this or like opium habit or something.
00:06:34.000 I know people that have done opium that are like functional.
00:06:37.000 Yeah.
00:06:38.000 You know, they can take pills and like kind of, I mean, I'm sure eventually their life falls off the rails, but it's like sort of semi-they're semi-functional when they're on these things.
00:06:48.000 They can hold down a job and show up every day.
00:06:52.000 And they're just like semi-functional opiate addicts.
00:06:56.000 There's a dude, I watched like a YouTube video, but like he's known for having this contrarian opinion on drugs that you can like control it, like you can, you can do these drugs.
00:07:06.000 What does he look like?
00:07:07.000 I don't know.
00:07:09.000 I think he's a black dude.
00:07:11.000 Oh, Carl Hart.
00:07:12.000 Dr. Carl Hart.
00:07:13.000 He was here?
00:07:14.000 Yeah, yeah.
00:07:14.000 He's been here a couple times.
00:07:15.000 He's great.
00:07:15.000 What do you think of his ideas?
00:07:17.000 I think it's entirely biologically variable.
00:07:22.000 I know people that cannot drink.
00:07:25.000 They drink and then they're gone.
00:07:26.000 They get hamsterized, get these black eyes where their soul goes away and then they're just off to the races and picking up hookers and doing cocaine and they find themselves in Guatemala.
00:07:38.000 They're just nuts.
00:07:39.000 They can't drink.
00:07:40.000 I can drink.
00:07:42.000 I don't pretend that the way my body handles alcohol is the way everybody's body handles alcohol.
00:07:49.000 I think that's the same with everything.
00:07:51.000 I think that's the same, most certainly with marijuana.
00:07:54.000 I know some people that just cannot smoke marijuana and other people, it's fine.
00:07:59.000 I think it's very, we're all very different physically.
00:08:03.000 It's interesting.
00:08:04.000 Alcohol is sort of on the downtrend all of America, but especially with young people, especially in Silicon Valley.
00:08:14.000 Everyone there listens to Huberman.
00:08:19.000 I call him the grand mufti of Silicon Valley because he'll say, no alcohol, no drinking.
00:08:23.000 Everyone's like, don't drink.
00:08:25.000 And all the parties are now mocktails and things like that.
00:08:29.000 There are probably a lot of boring conversations, unfortunately.
00:08:31.000 It's a little boring.
00:08:32.000 I mean, it's very repetitive.
00:08:34.000 It's all kind of like, will AI kill us?
00:08:38.000 You guys would know better than anybody.
00:08:40.000 You guys are at the forefront of it, unfortunately.
00:08:44.000 Yeah, I quit drinking.
00:08:45.000 I quit drinking over three months ago.
00:08:48.000 I know you guys used to do Sober October.
00:08:48.000 Oh, wow.
00:08:51.000 Yeah.
00:08:52.000 And that wasn't that hard.
00:08:53.000 And, you know, I was like, God, it's going to be one whole month.
00:08:56.000 And then I did.
00:08:57.000 I was like, that's pretty easy.
00:08:58.000 But I just had some revelations, I guess.
00:09:02.000 And I think the big one is just physical fitness.
00:09:05.000 I work out so much and I would drink and go to my club and have a couple of, not a lot either.
00:09:12.000 Just have a few drinks and the next day just feel like total shit.
00:09:16.000 I think with age especially, it starts affecting you.
00:09:19.000 It's always been like that.
00:09:20.000 It's always been.
00:09:21.000 It's always been like that.
00:09:22.000 I've always been hungover after a night of drinking, but you don't feel it normally.
00:09:27.000 Like in normal life, if I just did normal stuff, it'd be fine.
00:09:30.000 It's when you're in the gym that you notice.
00:09:32.000 When you're doing like second and third set of squats or something like that, you're like, oh, God.
00:09:32.000 Right.
00:09:38.000 Yeah, 100%.
00:09:40.000 And I haven't had any bad days since I quit drinking.
00:09:43.000 I've eliminated all that.
00:09:43.000 Oh, cool.
00:09:45.000 And I'm like, just that alone is worth it.
00:09:49.000 Just that alone, it's worth quitting.
00:09:51.000 So why do you think there's this trend?
00:09:54.000 Is it mostly for health?
00:09:57.000 Well, I think there's a big health trend with a lot of young people.
00:10:00.000 I think a lot of young people are recognizing the value of supplements.
00:10:04.000 There's that fly.
00:10:05.000 There's a difference between you and me.
00:10:06.000 I'm going to kill this motherfucker.
00:10:08.000 First fly I've ever had in here, Jamie.
00:10:09.000 That's kind of crazy.
00:10:11.000 Been here five years.
00:10:12.000 Flew with me from California.
00:10:12.000 One flyer.
00:10:14.000 He snuck in because there's a lot of steps that motherfucker has to go through to get into this room.
00:10:19.000 I think a lot of people are very health conscious.
00:10:21.000 That's the rise of cold plunging and sauna use and all these different things like intermittent fasting where people are really paying attention to their body and really paying attention and noticing that if you do follow these steps, it really does make a significant difference in the way you feel.
00:10:38.000 And maybe more importantly, the way everything operates, not just your body, but your brain.
00:10:44.000 It's like your function, your cognitive function improves with physical fitness.
00:10:50.000 And, you know, if you're an ambitious person and you want to do well in life, you want your body to work well, you know, alcohol is not your friend.
00:10:59.000 And I wonder how much of it is your impact because those things, you got me into all these things through your podcast.
00:11:07.000 My wife and I just built like a small kind of spa in our home with like a cold plunge and a sauna and a hot tub.
00:11:17.000 And I'll try to do it every day.
00:11:19.000 And you know, something you say, I keep saying to myself, it's like, conquer your inner bitch.
00:11:24.000 Yeah.
00:11:25.000 It's like, this is such a good, and I feel like cold plunge especially kind of, it's just something, regardless, health benefits or not, something about it, like just mental toughness, like trying to do it every day.
00:11:40.000 And every day I chicken out.
00:11:41.000 Every day I want to go up.
00:11:42.000 I don't want to go in, right?
00:11:45.000 I do too.
00:11:46.000 My inner bitch speaks the loudest when I'm lifting the lid off the cold plunge.
00:11:51.000 My inner bitch is like, don't do this.
00:11:53.000 You don't have to do this.
00:11:54.000 You can do whatever you want.
00:11:56.000 You're a free man.
00:11:57.000 You can go have a sandwich, you know?
00:11:58.000 Right, right.
00:11:59.000 But you just got to decide that you're the boss.
00:12:02.000 And I think a lot of what discipline is for me is that. Again, even keto, and I did carnivore and these diets, like, I'm not sure how much health benefit there is.
00:12:02.000 Yeah.
00:12:16.000 I feel like keto is really good on your blood sugar and keeps you kind of on a, you know, even keel kind of throughout the day.
00:12:23.000 But for me, whenever there's like a lot of chaos in my life, I look at what can I control.
00:12:29.000 Right.
00:12:30.000 And typically diet is the first thing.
00:12:32.000 Whatever it is, I'm like, I'm going to go carnivore.
00:12:34.000 I'm going to go keto.
00:12:36.000 And the fact that I can control that and enforce discipline on myself kind of puts me at ease.
00:12:44.000 And I feel like I can control the other thing in my business, family, life.
00:12:49.000 But that mindset is probably how you stop playing video games every day.
00:12:52.000 Yeah.
00:12:52.000 Because I would imagine, like we were talking about earlier, like that addiction is one of the strongest addictions I've ever faced in my life.
00:13:01.000 Like when I was hooked, if I would be talking to people and the conversation was boring, I'd be like, I could be playing Quake right now.
00:13:06.000 Why am I here having this boring ass conversation where I could be launching rockets at people and having a good time?
00:13:13.000 But the other thing for me is programming.
00:13:15.000 So I got into programming early in my life.
00:13:19.000 I was six years old when my father bought a computer.
00:13:24.000 I was born and raised in Amman, Jordan.
00:13:27.000 And we're the first people I know ever at the time that had a computer.
00:13:34.000 And I remember.
00:13:35.000 What year was this?
00:13:36.000 1993.
00:13:36.000 I was six years old.
00:13:38.000 Okay, so 93.
00:13:39.000 So what kind of computer was that?
00:13:41.000 Was that an old school IBM?
00:13:43.000 IBM PC, MS-DOS, Microsoft DOS.
00:13:47.000 Oh, so you did the real deal.
00:13:48.000 Yeah.
00:13:49.000 I know a lot of Americans would get a Mac as their first computer.
00:13:53.000 That's what I got.
00:13:54.000 Yeah, yeah, yeah.
00:13:55.000 No, we didn't have Mac.
00:13:56.000 I actually wasn't introduced to Apple until kind of recently in my life.
00:14:00.000 Really?
00:14:01.000 Like recently, recently?
00:14:01.000 Yeah.
00:14:02.000 Like, no, like, you know, 12 years ago, 13 years ago, when I moved to the U.S. God, Apple has such a stranglehold in America.
00:14:09.000 It's really incredible.
00:14:10.000 Yeah, it's amazing.
00:14:11.000 But we didn't know much about it.
00:14:13.000 So I got into DOS.
00:14:15.000 I remember one of my earliest memories is standing behind my father as he was kind of pulling up this huge manual and learning how to type commands.
00:14:26.000 And he was finger typing those commands.
00:14:29.000 And then I would watch him.
00:14:31.000 And then after he leaves, I'll go and try those things.
00:14:34.000 And one day he caught me.
00:14:35.000 I was like, what are you doing?
00:14:36.000 I'm like, I know how to do this.
00:14:38.000 I'll show you.
00:14:39.000 And so I knew how to start games, do a little bit of programming, do a little bit of scripting.
00:14:44.000 And that's how I got into computers.
00:14:47.000 And I was obsessed.
00:14:48.000 And initially, it sort of got me into gaming.
00:14:52.000 But then you want to mod the games.
00:14:55.000 Have you ever done any modding?
00:14:57.000 I've done a few things like turn textures off and stuff like that.
00:15:03.000 Yeah, and that's another thing that I think is healthy about gaming is like a gateway to programming.
00:15:07.000 Sure.
00:15:08.000 Gateway drug to programming.
00:15:09.000 And so I got into like modding, like Counter-Strike and things like that.
00:15:14.000 Those were fun.
00:15:15.000 And then just like the feeling that you can make something is just like such a profound, such a profound feeling.
00:15:22.000 And that's really kind of what I carried through my whole life and became sort of my life mission.
00:15:28.000 Now with my company, Replit, what we do is like we make it so that anyone can become a programmer.
00:15:35.000 You just talk to your phone and your app, sort of like ChatGPT, and it starts coding for you.
00:15:40.000 It's like a program software engineering agent.
00:15:43.000 Right.
00:15:44.000 So it's like the AI guides you through it.
00:15:47.000 Yeah, not only guides you through it, it codes for you.
00:15:50.000 So you're sort of, you know, programmers typically think about the idea a little bit, about the logic, but most of the time they're sort of wrangling the syntax and the IT of it all.
00:16:04.000 And I thought that was always additional complexity that doesn't necessarily have to be there.
00:16:10.000 And so when I saw GPT for the first time, I thought this could potentially transform programming and make it accessible to more and more people.
00:16:22.000 Because it really transformed my life.
00:16:24.000 The reason I'm in America is because I invented a piece of software.
00:16:28.000 And I thought if you make it available to more people, they can transform their lives.
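For a rough picture of what the prompt-to-code agent described above does, here is a minimal sketch in Python. It assumes the OpenAI Python client and an example model name; it is a toy illustration of the general idea, not Replit's actual system.

# Toy sketch of a prompt-to-code loop. NOT Replit's implementation.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a software engineering agent. Given a plain-English request, "
    "reply with a single runnable Python file that implements it."
)

def generate_app(request: str) -> str:
    """Turn a natural-language description into source code."""
    response = client.chat.completions.create(
        model="gpt-4o",  # example model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    code = generate_app("A page where a fire instructor can list upcoming courses.")
    with open("generated_app.py", "w") as f:
        f.write(code)
    # A real agent goes further: it runs the file, reads any errors,
    # and feeds them back to the model until the program works.

The difference between this toy and a production agent is the loop: running the generated code, observing failures, and iterating, which is the part such products automate.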
00:16:34.000 Why was your dad messing around with computers?
00:16:36.000 Was he doing it for fun?
00:16:40.000 I want to let you in on something.
00:16:42.000 Your current wireless carrier does not want you to know about Visible because Visible is the ultimate wireless hack.
00:16:50.000 No confusing plans with surprise fees, no nonsense, just fast speeds, great coverage without the premium cost.
00:16:58.000 With Visible, you get one-line wireless with unlimited data powered by Verizon's network for $25 a month, taxes and fees included.
00:17:09.000 Seriously, $25 a month flat.
00:17:12.000 What you see is what you pay.
00:17:13.000 No hidden fees on top of that.
00:17:15.000 Ready to see?
00:17:16.000 Join now and unlock unlimited data for just $25 a month on the Visible plan.
00:17:21.000 Don't think wireless can be so transparent?
00:17:24.000 So Visible?
00:17:25.000 Well, now you know.
00:17:26.000 Switch today at visible.com slash Rogan.
00:17:30.000 Terms apply.
00:17:31.000 See visible.com for plan features and network management details.
00:17:35.000 Yeah, so my dad is a Palestinian refugee.
00:17:39.000 Yeah, you were telling me the story, and I want to get into that because it's kind of crazy.
00:17:43.000 Tell the whole story of how this wound up happening.
00:17:45.000 Yeah, yeah, yeah.
00:17:46.000 So my family is originally from Haifa, which is now in Israel, and they were expelled as part of the 1948 Nakba, where Palestinians were sort of kicked out.
00:18:00.000 And they went to like Fiji.
00:18:02.000 How does your dad describe that?
00:18:03.000 How old was he when that was going on?
00:18:05.000 My father was born in Syria.
00:18:07.000 So my grandma and my grandpa and my uncles were kind of kicked out.
00:18:15.000 And the way they would describe that is they try to fight, they try to keep their home, but it was like this overwhelming force.
00:18:24.000 They weren't organized.
00:18:25.000 They were just people.
00:18:28.000 They didn't really have an army, at least in that place.
00:18:31.000 And eventually at gunpoint, they took their homes and told them to go.
00:18:37.000 If you're down south, you went to Gaza, and that's why 70% of Gazans are refugees from Israel.
00:18:43.000 Like the people that are getting massacred right now are originally from Israel, from the land that people call Israel today.
00:18:51.000 And then if you're in the north, like Haifa or Yafa, whatever, you went to Lebanon or to the West Bank or to Jordan or to Syria.
00:19:04.000 So my family went to Syria.
00:19:05.000 My father was born in Syria.
00:19:08.000 But my grandfather was like a railroad engineer.
00:19:14.000 So they were like city people.
00:19:17.000 They were urban.
00:19:18.000 So they couldn't like, you know, they wanted to have a place where they can, you know, they want to live in a city.
00:19:28.000 And so originally the West Bank didn't work for them and they ended up in Syria.
00:19:33.000 But then Amman, Jordan, was kind of coming up, and there was a lot of opportunities there.
00:19:37.000 So my father was born in Syria and then moved to Amman when he was six years old and built a life there.
00:19:43.000 And they really kind of focused on education and trying to kind of rebuild their life from scratch.
00:19:49.000 So my father and all my uncles kind of went and got educated in Egypt, Turkey, places like that.
00:19:56.000 And so my father got an engineering degree, civil engineering degree from Turkey.
00:20:03.000 And he was always interested in technology.
00:20:06.000 That whole thing, we're kicking people out of Palestine, is such an inconvenient story today.
00:20:14.000 When people are talking about Israel and Palestine and the conflict, they do not like talking about what happened in 1948.
00:20:22.000 Yeah, and I think it's important.
00:20:24.000 I think for us to reach some kind of peace, which is really hard to talk about when you see what's happened in Gaza, even yesterday.
00:20:34.000 Yeah.
00:20:35.000 Yeah, the people that were waiting for food got bombed.
00:20:39.000 It's insane.
00:20:40.000 And no one wants to talk about it.
00:20:41.000 Right.
00:20:43.000 And if you do talk about it, you're anti-Semitic, which is so strange.
00:20:47.000 I don't know how they've wrangled that.
00:20:49.000 It's been hard for me in tech because I'm probably the only prominent Palestinian in tech that is talking about it.
00:20:58.000 Do you get pushback?
00:20:59.000 Oh, of course.
00:21:00.000 Like, what do people say to you?
00:21:03.000 Anti-Semitic.
00:21:04.000 How is it anti-Semitic?
00:21:06.000 To criticize the state of Israel.
00:21:08.000 Our position, every modern Palestinian that I know, their position is like two-state solution.
00:21:13.000 We need the emergence of the state of Palestine, you know. Ending the occupation is the best way to guarantee peace and security, even for Israelis.
00:21:31.000 But yeah, it's just like it's used, it sort of reminds me, you know, in tech, we went through this like quote-unquote woke period where you couldn't talk about certain things as well.
00:21:43.000 Has that gone away?
00:21:44.000 Yeah.
00:21:45.000 Yeah, totally gone away.
00:21:45.000 Yeah.
00:21:46.000 Yeah.
00:21:47.000 What do you think caused it to go away?
00:21:50.000 Elon?
00:21:51.000 Really?
00:21:52.000 Yeah, like Twitter, buying Twitter.
00:21:54.000 Wow.
00:21:55.000 Buying Twitter is the single most impactful thing for free speech, especially on these issues, of being able to talk freely about a lot of subjects that are more sensitive.
00:22:09.000 Imagine if he didn't buy it.
00:22:11.000 Yeah.
00:22:12.000 I mean, that would have been.
00:22:13.000 Imagine if the same ownership was in place and then Harris wins and they continue to ramp things up.
00:22:21.000 Yeah, I don't know what you think of the new administration.
00:22:25.000 Certainly there are things that I like about some of their pro-tech posture and things like that.
00:22:31.000 But what's happening now is kind of disappointing.
00:22:35.000 It's insane.
00:22:35.000 We were told there would be no...
00:22:41.000 One is the targeting of migrant workers, not cartel members, not gang members, not drug dealers, just construction workers showing up in construction sites and raiding them.
00:22:56.000 Gardeners.
00:22:57.000 Yeah.
00:22:58.000 Like, really?
00:22:59.000 Or Palestinian students on college campuses.
00:23:04.000 Or not, like, there's a Turkish.
00:23:06.000 Did you see this video of this Turkish student at Tufts University that wrote an essay, and then there's a video of ICE agents, like, I don't know.
00:23:14.000 Is that the woman?
00:23:16.000 Yeah.
00:23:16.000 Yeah, yeah.
00:23:17.000 What was her essay about?
00:23:18.000 It was just critical of Israel, right?
00:23:20.000 Just critical of Israel.
00:23:21.000 Yeah, I mean.
00:23:21.000 And that's enough to get you kicked out of the country.
00:23:24.000 There's a long history of anti-colonial activism in U.S. colleges that led to South Africa changing and all of that.
00:23:34.000 And I think this is a continuation of that.
00:23:37.000 I mean, I don't agree with all their, like, there's a lot of radicalism.
00:23:42.000 A lot of young people are attracted to more radical positions on Israel-Palestine.
00:23:48.000 Which I don't mind those positions as long as someone's able to counter those positions.
00:23:54.000 The problem is these supposed free speech warriors want to silence anybody who has a more conservative opinion.
00:24:01.000 That's not the way to handle it.
00:24:02.000 The way to handle it is to have a better argument.
00:24:04.000 That's not American.
00:24:05.000 It's not American.
00:24:06.000 What attracted me to this country from the moment that I was aware and we started consuming American media and American culture is freedom, is the concept of freedom, which I think is real.
00:24:19.000 It is.
00:24:19.000 I think is real.
00:24:20.000 I was watching this psychology student from, I think he's from Columbia, but he has a page on Instagram.
00:24:27.000 I wish I could remember his name because he's very good.
00:24:29.000 He's a young guy.
00:24:31.000 But he had a very important point, and it was essentially that fascism rises as the over-correction response to communism.
00:24:39.000 And that we essentially had this Marxist communism rise in first universities, and then it made its way into business because these people left the university and then found their way into corporate America.
00:24:57.000 And then they were essentially instituting those.
00:24:59.000 And then the blowback to that, the pushback, is this fascism.
00:25:04.000 That happened last century?
00:25:07.000 Well, they're talking about forever historically.
00:25:10.000 He's talking about over time, whether it's Mao, whether it's Stalin, like fascism is the response almost always to communism.
00:25:20.000 And that, you know, what we experience with this country is this continual over-correction.
00:25:20.000 Interesting.
00:25:31.000 Over-correction to the left, then over-correction to the right to counter that.
00:25:35.000 And the people that are the... that's the guy.
00:25:38.000 Anthony Rispeau.
00:25:40.000 That's it.
00:25:41.000 Really, really smart guy.
00:25:43.000 And very interesting thing.
00:25:44.000 Jamie, how did you nail that that quick?
00:25:46.000 Good job, buddy.
00:25:48.000 You said those words right as I saw them.
00:25:50.000 Decades of training.
00:25:51.000 Yeah.
00:25:52.000 Communism, fascism.
00:25:53.000 Yeah, communism came first, fascism came response.
00:25:55.000 Now today's left tears down norms and destabilizes the country under the guise of progress.
00:25:59.000 We're watching the conditions for another reaction build.
00:26:02.000 History doesn't repeat, but it echoes.
00:26:04.000 Yeah.
00:26:05.000 Do you know this theory?
00:26:08.000 I know you've had Mark Andreessen on the show, this James Burnham managerial revolution theory.
00:26:14.000 No, not by hand.
00:26:16.000 I'm not an expert in it, but the idea is that communism, fascism, and even some form of capitalism that sort of we're living under right now is like managerialism is the idea that capitalism used to be this idea that the owner-founders of those companies, of capitalist companies, were running them.
00:26:43.000 And it was like true capitalism of sorts.
00:26:47.000 But both communism and fascism share this property of centralized control and like a class of people that are sort of managerials.
00:27:02.000 And maybe those are the elite Ivy League students that are trained to be managers and they grow up in the system, kind of bred to become like managers of these companies.
00:27:15.000 And today's America is like trending that way where it is like a managerial society.
00:27:23.000 In Silicon Valley, there's like a reaction to that right now.
00:27:26.000 People call it founder mode, where a lot of founders felt like they were losing control of their companies because they're hiring all these managers.
00:27:34.000 And these managers are running the companies like you would run Citibank.
00:27:40.000 And then a lot of founders were like, no, we need to run those companies like we built them.
00:27:47.000 And Elon is obviously at the forefront of that.
00:27:50.000 I once visited XAI when they were just starting out, Elon's AI company.
00:27:56.000 And there were like 70 people.
00:27:58.000 All of them reported to Elon.
00:28:00.000 They didn't have a single manager on staff.
00:28:02.000 Wow.
00:28:04.000 And they would send him an email every week.
00:28:06.000 It was like, what did you get done this week?
00:28:08.000 Right.
00:28:09.000 Well, that was the outrageous thing that he asked people to do at Doge.
00:28:12.000 Yeah.
00:28:13.000 People were freaking out.
00:28:14.000 Five minutes a week.
00:28:16.000 What are the things you accomplished this week?
00:28:18.000 You know, he said, all you have to do is respond.
00:28:18.000 How?
00:28:20.000 Right.
00:28:21.000 And they didn't want, they pushed back so hard on being accountable for their work.
00:28:26.000 Yeah.
00:28:26.000 But that's government for you.
00:28:28.000 Yeah.
00:28:28.000 You know what I mean?
00:28:29.000 Government is the grossest, most incompetent form of business.
00:28:34.000 You know, it's a monopoly.
00:28:35.000 It's a complete, total monopoly.
00:28:37.000 Like the way he describes some of the things that they found at Doge, it's like you could never run a business that way.
00:28:44.000 Because not only would it not be profitable, the fraud would get you arrested.
00:28:50.000 You'd go to jail for something that's standard in the government.
00:28:53.000 Right, right.
00:28:55.000 I mean, my opinion of talented people, people like Elon, things like that, is that we should be in the free market.
00:29:07.000 I think you can do little change in government.
00:29:13.000 The best we can sort of expect of our government is to get out of the way of innovation, let people, let founders, entrepreneurs innovate and make the market more dynamic.
00:29:26.000 But again, going back to this idea of managerialism, if you look at the history of America, one really striking stat is that new firm creation, new startups in the United States, have been trending down for a long time.
00:29:40.000 There's all this talk of startups in Silicon Valley and all of that, but in reality, there's less entrepreneurship than there used to be.
00:29:47.000 And instead, we have this system of conglomerates and really big companies and monopsony, which is the idea that the banks, or BlackRock, own competitors as well, owning all these companies.
00:30:00.000 And they implicitly collude because they have the same owners.
00:30:04.000 And all of that is sort of anti-competitive.
00:30:07.000 So the market has gotten less dynamic over time.
00:30:11.000 And this is also part of the reason I'm excited about our mission at Replit to make it so that anyone can build a business.
00:30:18.000 Actually, on the way here, your driver, Jason, is a fireman.
00:30:23.000 And so I was telling him about our business.
00:30:25.000 And he does training for other firemen around the country.
00:30:29.000 He flies around.
00:30:30.000 And he does it out of pocket and just for the love of the game.
00:30:35.000 And he was like, yeah, I've had this idea for a website so I can scale my teaching.
00:30:39.000 I can make it known where am I going to be giving a course, put the material online.
00:30:47.000 And we were brainstorming, potentially this could be a business.
00:30:51.000 And I feel like everyone, like not everyone, but a lot of people have business ideas, but they are constrained by their ability to make them.
00:31:02.000 And then you go, you try to find a software agency and they quote you sort of a ton of money.
00:31:07.000 Like we have a lot of stories.
00:31:09.000 There's this guy.
00:31:09.000 His name is Joan Cheney.
00:31:11.000 He's a user of our platform.
00:31:14.000 He's a serial entrepreneur, but whenever he wanted to try ideas, he would spend hundreds of thousands of dollars to kind of get an idea off the ground.
00:31:22.000 And now he uses Replit to try those ideas really quickly.
00:31:27.000 And he recently made an app in a number of weeks, like three, four, five weeks, that made him $180,000.
00:31:36.000 So it's on its way to generating millions of dollars.
00:31:40.000 And because he was able to build a lot of businesses and try them really quickly.
00:31:46.000 Right, without the big investment.
00:31:48.000 Without the big investment, without other people, which at some point you need more collaborators, but early on in the brainstorming and in the prototyping phase, you want to test a lot of ideas.
00:32:01.000 And so it's sort of like 3D printing, right?
00:32:02.000 Like 3D printing, although people don't think it had a lot of impact on industry, it's actually very useful for prototyping.
00:32:11.000 I remember talking to Jack Dorsey about this, and early on in Square, they had this Square device, and it was amazing.
00:32:19.000 You would plug it into the headphone jack to accept payments.
00:32:22.000 Do you remember that?
00:32:24.000 And so a lot of what they did to kind of develop the form factor was using 3D printing because it's a lot faster to kind of iterate and prototype and test with users.
00:32:33.000 And so software, over time, like when I was, you know, I explained how when I was growing up, it was kind of easier to get into software.
00:32:43.000 Because you boot up the computer and you get the MS-DOS prompt, and it immediately invites you to program in it.
00:32:50.000 Whereas today, you, you know, buy an iPhone or a tablet, and it is like a purely consumer device.
00:32:58.000 It has like all these amazing colors and does all these amazing things, and kids get used to it very quickly, but it doesn't invite you to program it.
00:33:07.000 And therefore, we kind of lost that sort of hacker ethos.
00:33:11.000 There's less programmers, less people who are making things because they got into it organically.
00:33:17.000 It's more like they go to school to study computer science because someone told them you have to study computer science.
00:33:23.000 And I think making software needs to be more like a trade.
00:33:26.000 Like, you don't really have to go to school and spend four or five years and hundreds of thousands of dollars to learn how to make it.
00:33:34.000 Well, what I'm hearing now is that young people are being told to not go into programming because AI is essentially going to take all of that away.
00:33:43.000 That you're just going to be able to use prompts.
00:33:45.000 You're just going to be able to say, I want an app that can do this.
00:33:49.000 I want to be able to scale my business to do that.
00:33:53.000 You know, what should I do?
00:33:54.000 Yeah, that's what we built.
00:33:56.000 That's what Replit is.
00:33:57.000 It automates the.
00:33:58.000 Do you agree with that, that young people shouldn't learn programming?
00:34:01.000 Or do you think that there's something very valuable about being able to actually program?
00:34:06.000 Look, I think that you will always get value from knowledge.
00:34:12.000 I mean, that's a timeless thing.
00:34:13.000 That's why, right?
00:34:14.000 You know, it's like, you know, you and I are into cars, right?
00:34:18.000 Like, I don't really have to tune up my car anymore, but it's useful to know more about cars.
00:34:25.000 It's fun to know about cars.
00:34:26.000 You know, if something happens, if I go to the mechanic and he's doing work on my car, I know he's not going to scam me because I can understand what he's doing.
00:34:35.000 Knowledge is always useful.
00:34:36.000 And so I think people should learn as much as they can.
00:34:39.000 And I think the difference, though, Joe, is that when I was coming up in programming, you learned by doing.
00:34:47.000 Whereas it became this sort of like very sort of traditional type of learning where it's like a textbook learning.
00:34:57.000 Whereas I think now we're back with AI.
00:35:00.000 We're back to an era of learning by doing.
00:35:02.000 Like when you go to our app, you see just text prompts, but a couple clicks away, you'll see the code.
00:35:09.000 You'll be able to read it.
00:35:10.000 You'll be able to ask the machine, what did you do there?
00:35:12.000 Teach me how this piece of code works.
00:35:15.000 Oh, that's cool.
00:35:17.000 And so a lot of kids are learning.
00:35:20.000 Kids are such sponges, too.
00:35:22.000 They're such sponges.
00:35:24.000 And kids already know way more about.
00:35:25.000 I'm like, how did you do that with your phone?
00:35:26.000 And my daughter will go, are you doing this?
00:35:29.000 You got the little thumbs moving 100 miles an hour.
00:35:31.000 Yeah, exactly.
00:35:32.000 How'd you figure that out?
00:35:33.000 TikTok.
00:35:34.000 What?
00:35:35.000 Dude, the craziest thing is we have a lot of people making software from their phone.
00:35:40.000 They'll spend eight hours on their phone because we have an app.
00:35:42.000 They'll spend eight hours on their phone kind of making software.
00:35:45.000 Wow.
00:35:45.000 And that's better than watching TikTok.
00:35:48.000 It makes me very happy about that.
00:35:51.000 You're just accomplishing something.
00:35:52.000 Yeah, you're doing creation.
00:35:54.000 You're just droning.
00:35:56.000 The act of creation is divine.
00:35:59.000 We just announced a partnership with the government of Saudi Arabia where they want their entire population essentially to learn how to make software using AI.
00:36:10.000 So they set up this new company called Humain, and Humain is this end-to-end value chain company for AI, all the way from chips to software.
00:36:22.000 And they're partnering with a lot of American companies as part of the coalition that went to Saudi a few months ago with President Trump to do the deals with the Gulf region.
00:36:34.000 And so they're doing deals with AMD, NVIDIA, a lot of other companies.
00:36:38.000 And so we're one of the companies that partnered with Humain.
00:36:42.000 And so we want to bring AI coding to literally every student, every government employee.
00:36:45.000 Because the thing about it is it's not just entrepreneurs that's going to get something from it.
00:36:52.000 It's also if you're...
00:36:58.000 Really?
00:36:58.000 Yeah.
00:36:59.000 And so, you know.
00:37:00.000 So this is the best case scenario future.
00:37:02.000 Yes.
00:37:03.000 As opposed to everyone goes on universal basic income and the state controls everything and it's all everything is done through automation.
00:37:10.000 I don't believe in that.
00:37:11.000 You don't?
00:37:11.000 I don't.
00:37:11.000 I don't.
00:37:12.000 Okay.
00:37:12.000 Good.
00:37:13.000 Help me out, man.
00:37:14.000 Yeah.
00:37:15.000 Give me the positive rose-colored glasses view of what AI is going to do for us.
00:37:20.000 So AI is good at automating things.
00:37:20.000 Yeah.
00:37:24.000 I think there's a primacy to human beings still.
00:37:27.000 Like I think humans are...
00:37:42.000 I'm so bullish on AI.
00:37:43.000 I think it's going to change the world.
00:37:44.000 But at the same time, I don't think it's replacing humans because it's not generalizing, right?
00:37:54.000 AI is like a massive remixing machine.
00:37:58.000 It can remix all the information it learned.
00:38:00.000 And you can generate a lot of really interesting ideas and really interesting things.
00:38:04.000 You can have a lot of skills by remixing all these things.
00:38:09.000 But we have no evidence that it can generate a fundamentally novel thing or a paradigm change.
00:38:18.000 Can a machine go from Newtonian physics to quantum mechanics, really have a fundamental disruption in how we understand things or how we do things?
00:38:28.000 Do you think that takes creativity?
00:38:30.000 I think that's creativity, for sure.
00:38:31.000 And that's a uniquely human characteristic?
00:38:35.000 For now?
00:38:36.000 For now?
00:38:37.000 Definitely for now.
00:38:38.000 I don't know, forever.
00:38:40.000 Actually, one of my favorite JRE episodes was Roger Penrose.
00:38:45.000 Yes.
00:38:45.000 Do you remember him?
00:38:46.000 So do you remember the argument that he made about why humans are special?
00:38:52.000 He said something like, he believes there are things that are true, that only humans can know are true, but machines cannot prove are true.
00:39:05.000 It's based on Gödel's incompleteness theorem.
00:39:09.000 And the idea is that you can construct a mathematical system where it has a paradoxical statement.
00:39:19.000 So, for example, you can construct a statement G that says: this statement is not provable by the machine.
00:39:29.000 Or like, the machine cannot prove this statement.
00:39:33.000 And so if the machine proves the statement, then the statement is false.
00:39:38.000 So you have a paradox.
00:39:40.000 And therefore, the statement is sort of true from the perspective of an observer, like a human, but it is not provable in this system.
00:39:52.000 So Roger Penrose says these paradoxes that are not really resolved in mathematics and machines are no problem for humans.
00:40:01.000 And therefore, his sort of like a bit of a leap is that therefore there's something special about humans and we're not fundamentally a computer.
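For reference, this is the standard Gödel construction the argument leans on, sketched briefly in LaTeX. For a consistent formal system $F$ strong enough to do arithmetic, one can build a sentence $G_F$ such that

$$F \vdash \left( G_F \leftrightarrow \neg\,\mathrm{Prov}_F(\ulcorner G_F \urcorner) \right)$$

that is, $G_F$ asserts "this statement is not provable in $F$." If $F$ proved $G_F$, it would prove a falsehood, so if $F$ is sound it cannot prove $G_F$; but that is exactly what $G_F$ says, so $G_F$ is true. A human reasoning outside $F$ can see that $G_F$ is true even though $F$ cannot prove it, and that unprovable-but-knowable gap is what Penrose's leap rests on.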
00:40:12.000 Right.
00:40:13.000 That makes sense.
00:40:15.000 I mean, whatever creativity is, whatever allows you to make poetry or jazz or literature, like whatever, whatever allows you to imagine something and then put it together and edit it and figure out how it resonates correctly with both you and whoever you're trying to distribute it to.
00:40:37.000 There's something to us that's different.
00:40:40.000 I mean, we don't really have a theory of consciousness.
00:40:42.000 And I think it's like sort of hubris to think that consciousness just emerges.
00:40:47.000 And it's plausible.
00:40:48.000 Like I'm not totally against this idea that you built a sufficiently intelligent thing and suddenly it is conscious.
00:40:57.000 But there's no... it's like a religious belief that a lot of Silicon Valley has, that consciousness is just a side effect of intelligence, or that consciousness is not needed for intelligence.
00:41:19.000 Somehow it's like this superfluous thing.
00:41:22.000 And they try not to think or talk about consciousness because actually consciousness is hard.
00:41:27.000 Hard to define.
00:41:28.000 Hard to define, hard to understand scientifically.
00:41:31.000 It's what I think Chalmers calls the hard problem of consciousness.
00:41:36.000 But I think it is something we need to grapple with.
00:41:39.000 We have one example of general intelligence, which is human beings.
00:41:45.000 And human beings have a very important property that we can all feel, which is consciousness.
00:41:50.000 And that property, we don't know how it happens, how it emerges.
00:41:54.000 People like Roger Penrose have these theories about quantum mechanics in microtubules.
00:42:04.000 I don't know if you got into that with him, but I think he has a collaborator, neuroscientist, Hameroff, I think, or something like that.
00:42:17.000 But people have so many theories.
00:42:19.000 I'm not saying Penrose has the answers, but it's something that philosophers have grappled with forever.
00:42:29.000 And there are a lot of interesting theories.
00:42:34.000 There's this theory that consciousness is primary, meaning the material world is a projection of our collective consciousness.
00:42:44.000 Yes.
00:42:44.000 Yeah.
00:42:45.000 That is a very confusing but interesting theory.
00:42:48.000 And then there's a lot of theories that everything is conscious.
00:42:52.000 We just don't have the ability to interact with it.
00:42:56.000 You know, Sheldrake has a very strange view of consciousness.
00:42:59.000 Who's Sheldrake?
00:43:00.000 Rupert Sheldrake.
00:43:01.000 I don't know.
00:43:03.000 He's got this concept.
00:43:05.000 I think it's called morphic resonance.
00:43:08.000 And see if you can find that so we could define it so I don't butcher it.
00:43:13.000 But there's people that believe that consciousness itself is something that everything has and that we are just tuning into it.
00:43:22.000 Morphic resonance, a theory proposed by Rupert Sheldrake, suggests that all natural systems, from crystals to humans, inherit a collective memory of past instances of similar systems.
00:43:33.000 This memory influences their form and behavior, making nature more habitual than governed by fixed laws.
00:43:40.000 Essentially, past patterns and behaviors of organisms influence present ones through connections across time and space.
00:43:48.000 That's wild.
00:43:49.000 And is he a scientist, or is this more like a news?
00:43:53.000 What is his exact background?
00:43:55.000 Harvard.
00:43:56.000 Oh, wow.
00:43:57.000 Okay.
00:43:57.000 So he's a parapsychology researcher who proposed the concept of morphic resonance, a conjecture that lacks mainstream acceptance.
00:43:57.000 Yeah.
00:44:05.000 It's been widely criticized as pseudoscience.
00:44:08.000 Of course.
00:44:08.000 Anything interesting.
00:44:09.000 That sounds interesting, though.
00:44:11.000 But there are philosophers that have sort of a similar idea of this sort of universal consciousness and humans are getting a slice of that consciousness.
00:44:11.000 Yeah.
00:44:24.000 Every one of us is tapping into some sort of universal consciousness.
00:44:29.000 Yes.
00:44:30.000 By the way, I think there are some psychedelic people that think the same thing, that when you take psychedelic, you're just peering into that universal consciousness.
00:44:39.000 Yes.
00:44:40.000 That's the theory.
00:44:40.000 Yeah.
00:44:42.000 Because that's also the most unknown.
00:44:44.000 I mean, the experience is so baffling that people come back and the human language really lacks any phrases, any words that sufficiently describe the experience.
00:44:58.000 So you're left with this very stale, flat, one-dimensional way of describing something that is incredibly complex.
00:45:13.000 So it always feels, even the descriptions, even like the great ones like Terrence McKenna and Alan Watts, like they're descriptions that fall very short of the actual experience.
00:45:22.000 Nothing about it makes you go, yes, that's it.
00:45:25.000 He nailed it.
00:45:26.000 It's always like, kind of, yeah, kind of, that's it.
00:45:29.000 Do you still do it?
00:45:30.000 Not much.
00:45:32.000 You know, it's super illegal, unfortunately.
00:45:35.000 That's a real problem.
00:45:36.000 It's a real problem, I think, with our world, the Western world, is that we have thrown this blanket phrase.
00:45:47.000 You know, we talk about language being insufficient.
00:45:50.000 The word drugs is a terrible word to describe everything that affects your consciousness or affects your body or affects performance.
00:46:01.000 You have performance-enhancing drugs, like steroids, and then you have amphetamines, and then you have opiates, and you have highly addictive things, even coffee.
00:46:13.000 Nicotine.
00:46:14.000 And then you have psychedelics.
00:46:16.000 I don't think psychedelics are drugs.
00:46:18.000 I think it's a completely different thing.
00:46:20.000 It's really hard to get addicted to them, right?
00:46:22.000 Well, it's almost impossible.
00:46:23.000 I mean, you could certainly get psychologically addicted to experiences.
00:46:27.000 I think there's also a real problem with people who use them and think that somehow or another they're just from using them gaining some sort of advantage over normal society.
00:46:40.000 And that's – You don't think that's true?
00:46:43.000 I think it's a spiritual narcissism that some people have. I think it's very foolish, and it's a trap.
00:46:54.000 You know, I think it's like it's a similar trap that famous people think they're better than other people because they're famous.
00:47:00.000 You know what I mean?
00:47:01.000 Yeah.
00:47:01.000 Yeah, I felt that with a lot of people who get into sort of more Eastern philosophy is that there's this thing about them where it feels like there's this air of arrogance.
00:47:15.000 That like I know something more than you know.
00:47:15.000 Yeah.
00:47:18.000 Right, right, right.
00:47:18.000 And that's what they hold it over you.
00:47:21.000 That's the trap.
00:47:22.000 But that doesn't mean that there's not valuable lessons in there to learn.
00:47:26.000 I think there are.
00:47:27.000 And I think there's valuable perspective enhancing aspects to psychedelic experiences that we are denying people.
00:47:36.000 You know, you're denying people this potential for spiritual growth, like legitimate spiritual growth.
00:47:42.000 And personal.
00:47:43.000 Yeah, healing.
00:47:44.000 The Ibogaine thing they're trying to do in Texas, I think, is amazing.
00:47:47.000 And they passed this.
00:47:49.000 So this is also with the help of former Governor Rick Perry, who's a Republican.
00:47:53.000 But he's seen what an impact Ibogaine has had on soldiers.
00:47:58.000 And all these people that come back from the war.
00:48:00.000 Horrible PTSD and suicidal.
00:48:04.000 We lose so many servicemen and women to suicide.
00:48:08.000 And this has been shown to have a tremendous impact.
00:48:12.000 And so because of the fact that a guy like Rick Perry stuck his neck out, who's a Republican former governor, you would think the last person ever.
00:48:22.000 But because of his experiences with veterans and his love of veterans and people that have served this country, they've passed that in Texas.
00:48:28.000 I think that's a really good first step.
00:48:31.000 And the great work that MAPS has done, working with MDMA primarily, doing the same thing and working with people that have PTSD.
00:48:44.000 There's so many beneficial compounds.
00:48:47.000 Yeah, ketamine is one I think that's a lot of research happening already now on depression specifically, right?
00:48:54.000 Yeah.
00:48:56.000 So there's quite a bit of research.
00:48:59.000 Have you heard, I don't know if it's true, but have you heard of mushrooms healing long COVID?
00:49:06.000 I don't know what long COVID means because everybody I've talked to that has long COVID was also vaccinated.
00:49:13.000 I think long COVID is vaccine injury.
00:49:15.000 That's what I think.
00:49:17.000 I think in a lot of cases.
00:49:20.000 There is such a thing as like the post-viral malaise or a fact that's always been there.
00:49:26.000 Sure.
00:49:27.000 Well, there's a detrimental effect that it has to your overall biological health, right?
00:49:31.000 Yeah, yeah.
00:49:32.000 You know, your overall metabolic health.
00:49:34.000 But what causes someone to not rebound from that?
00:49:39.000 What causes someone to rebound fairly easily?
00:49:41.000 Well, mostly it's metabolic health, you know, other than like extreme biological variabilities, vulnerabilities that certain people have to different things, you know, obviously.
00:49:51.000 Yeah, maybe that's why I think, so there's a lot of these long COVID protocols.
00:49:55.000 Metformin is usually part of it.
00:49:57.000 So maybe that acts on your metabolic system.
00:50:01.000 Well, yeah, metformin is one of the anti-aging protocols that Sinclair uses and a lot of these other people that are into the anti-aging movement.
00:50:09.000 Yeah.
00:50:10.000 You know, I had this like weird thing happen where I started like feeling fatigued like a couple few years ago and I would like sleep hours and the more I sleep, the more tired I get in the morning.
00:50:26.000 Did you get blood work done?
00:50:28.000 I got blood work done and I there were some things about it that I needed to fix and I fixed all of them.
00:50:35.000 Like what was off?
00:50:36.000 Lots, you know: blood sugar in the morning, cholesterol, which I don't know if some people don't believe, but, you know, all my numbers got better.
00:50:48.000 Vitamin D, everything got better, and I could feel...
00:50:52.000 Did the fatigue get better?
00:50:53.000 No, I could feel marginal improvements, but the fatigue did not get better.
00:50:58.000 Were you vaccinated?
00:50:59.000 No.
00:51:01.000 Good for you.
00:51:02.000 That's hard to do in Silicon Valley.
00:51:04.000 Yeah.
00:51:05.000 Yeah, I tend to have a negative reaction to anyone forcing me to do something.
00:51:12.000 Good for you.
00:51:12.000 Was it the same thing now with like this, you know, talking about Palestine and things like that?
00:51:17.000 Like the more they come at me, the more I want to say things.
00:51:21.000 It's not always a good thing, but I think I grew up this way.
00:51:26.000 I've always kind of looked different and felt different.
00:51:30.000 Well, there's a reality to this world that there's a lot of things that people just accept that you're not allowed to challenge that are deeply wrong.
00:51:37.000 Yeah, and with regards to the vaccine, I was also informed about it.
00:51:41.000 Like it was clear early on that it wasn't a home run.
00:51:47.000 It wasn't, well, first of all, it wasn't going to stop the spread.
00:51:51.000 So that was a lie.
00:51:54.000 And the heart condition in young men is real.
00:51:58.000 And I had friends that had this issue.
00:52:00.000 And so if you're healthy and like, you know, why take the vaccine?
00:52:10.000 It doesn't stop the spread.
00:52:11.000 You can still get the virus.
00:52:13.000 I'll tell you why.
00:52:14.000 Money.
00:52:14.000 What?
00:52:15.000 It's the only reason why.
00:52:16.000 It's the only reason why.
00:52:17.000 The only reason why they wanted to make an enormous amount of money.
00:52:20.000 And the only way to do that is to essentially scare everyone into getting vaccinated, force, coerce, do whatever you can, mandate it at businesses, whatever you can, mandate it for travel, do whatever you can, shame people.
00:52:33.000 That's the thing that is really disheartening about American culture today is, and again, I love America.
00:52:42.000 It afforded me so much.
00:52:43.000 I'm like, you know, like I'm the walking evidence of the American dream being possible, coming with literally nothing.
00:52:50.000 That's what I really love about immigrants that love America.
00:52:54.000 They know, they've been other places.
00:52:56.000 They know that this really is a very unique place.
00:52:58.000 Right.
00:52:59.000 And the speech thing is interesting because when something happens, there's this, I don't know, you can call them useful idiots or whatever, but there's this suppression that immediately happens.
00:53:10.000 Yes.
00:53:11.000 And we're seeing it right now with the war in Iran where any dissenting voices are just like hit with overwhelming force.
00:53:20.000 Don't you think that a lot of that is coordinated, though?
00:53:23.000 I think with social media, well, you know, we've talked about it.
00:53:28.000 I don't think it was coordinated with COVID, like the two weeks to stop the spread.
00:53:31.000 It was just like...
00:53:36.000 Yeah.
00:53:36.000 Maybe there was a message pushed top down and then the... It's coordinated at first, but then a bunch of people do the man's work for the man.
00:53:50.000 I think it comes from a good place.
00:53:51.000 Like, a lot of people want to trust the authorities.
00:53:56.000 Like, they're pro-science.
00:53:58.000 They view themselves as enlightened, like the liberal type, rational, educated.
00:54:06.000 But I think they're naive about the corruption in our institutions and the corruption of money specifically.
00:54:18.000 And so they parrot these things and become overly aggressive at suppressing dissenting voices.
00:54:26.000 Yes.
00:54:28.000 It becomes a religious thing almost.
00:54:30.000 But here's the sort of white pill about America.
00:54:33.000 Then there are voices like yours and others that create this pushback that, and you took a big hit, it probably was very stressful for you, but you could see there's this pushback and then it starts opening up and maybe people can talk about it a little bit and then slowly opens up and now there's a discussion.
00:54:56.000 And so I think what I said right now about America is challenging, but also the flip side of that is there's this correction mechanism.
00:55:08.000 And again, with the opening up of platforms like Twitter and others, and by the way, a lot of others copied it.
00:55:15.000 You had Zuck here.
00:55:17.000 I worked at Facebook.
00:55:18.000 I know that was very, let's say... I think he always held free speech in high regard, but there were a lot of people in the company that didn't.
00:55:30.000 Yes, I would agree with that.
00:55:31.000 And there was suppression.
00:55:34.000 But then now it's the other way around, I would say with the exception of the question of Palestine and Gaza.
00:55:42.000 But even that is getting better.
00:55:47.000 There's at least some pushback.
00:55:49.000 It's available.
00:55:50.000 It's just not promoted.
00:55:52.000 You know, it's interesting.
00:55:53.000 Not to continue.
00:55:56.000 I don't mean to kind of...
00:56:11.000 They're sincere and they're looking at what's happening in Gaza and they're seeing images and they're saying, this is not what we should be as America.
00:56:22.000 We should be pro, pro-life, pro-peace.
00:56:26.000 And I really appreciate that.
00:56:28.000 And that's starting to open up.
00:56:31.000 I think in the future that will be the primary way people look at it.
00:56:35.000 Just the way a lot of people oppose the Vietnam War in the late 60s.
00:56:40.000 But it was, you know, you would get attacked.
00:56:43.000 And I think now people realize that was the correct response.
00:56:47.000 And I think in the future, people realize the correct response is like, this is not.
00:56:53.000 Yeah, October 7th was awful.
00:56:55.000 Absolutely.
00:56:56.000 Obviously.
00:56:57.000 Terrible attack.
00:56:58.000 But also, what they've done to Gaza is fucking insane.
00:57:02.000 And if you can't see that, if you can't say that, and your response is, Israel has the right to defend itself.
00:57:02.000 It's insane.
00:57:09.000 Like, what are you talking about?
00:57:10.000 Against what?
00:57:11.000 Children?
00:57:12.000 Against women and children that are getting blown apart?
00:57:14.000 Against aid workers that are getting killed?
00:57:16.000 Like, what are you talking about?
00:57:18.000 Like, we can't have a rational conversation if you're not willing to address that.
00:57:24.000 Yeah.
00:57:25.000 I think their heart is hardened.
00:57:27.000 If I'm trying to be as charitable as possible: like the Israelis specifically, maybe from October 7, what they saw there, their heart is hardened.
00:57:37.000 And I think a lot of people, especially on the Republican side, they're unable to see the Palestinians as humans, especially as people with emotions and feelings and all of that.
00:57:52.000 Like imagine if that was happening to Scandinavia, you know?
00:57:55.000 Yeah, right?
00:57:56.000 Yeah, exactly.
00:57:58.000 It's very strange.
00:58:00.000 My kid, my five-year-old kid called me two days ago.
00:58:05.000 They're in Amman, Jordan.
00:58:07.000 They're visiting their grandparents.
00:58:11.000 And I was in the car, and it was FaceTime.
00:58:17.000 And the moment the camera opened, he's like, what are you doing?
00:58:20.000 Why are you outside?
00:58:21.000 There are sirens.
00:58:22.000 There are rockets.
00:58:23.000 You have to go inside.
00:58:24.000 And I'm like, dad, like, I am in California.
00:58:29.000 We don't have sirens and rockets.
00:58:33.000 And then I asked him, like, are you afraid?
00:58:36.000 Because you're hearing all this.
00:58:37.000 He's a California kid.
00:58:40.000 He's never, you know, he didn't have the upbringing that I had.
00:58:42.000 And so it's the first time he's getting exposed to, I don't think he understands what war is.
00:58:47.000 Of course.
00:58:48.000 And I was like, are you afraid?
00:58:49.000 He's like, no, I'm afraid that other people are, you know, I want everyone to be okay.
00:58:58.000 But I know he was shook by it.
00:59:02.000 And I took him out.
00:59:05.000 They're on their way back.
00:59:06.000 I just couldn't.
00:59:08.000 Of course.
00:59:09.000 That's just a bad place to be right now.
00:59:11.000 But also, like, this conversation is happening in the West Bank.
00:59:14.000 It's happening in Israel.
00:59:16.000 It's happening in Gaza.
00:59:17.000 You know, people want peace.
00:59:19.000 People want to live.
00:59:21.000 People want to trade.
00:59:22.000 People want to build.
00:59:23.000 And this is what I made my life mission about, is about giving people tools to build to improve their lives.
00:59:31.000 And I think we're just led by maniacs.
00:59:34.000 Exactly.
00:59:35.000 That's exactly what it is.
00:59:37.000 You have people that are in control of large groups of people that convince these people that these other large groups of people that they don't even know are their enemies.
00:59:46.000 And those large groups of people are also being convinced by their leaders that those other groups of people are their enemies.
00:59:53.000 And then rockets get launched.
00:59:55.000 And it's fucking insane.
00:59:56.000 And the fact that it's still going on in 2025 with all we know about corruption and the theft of resources and power and influence, it's crazy that this is still happening.
01:00:08.000 I'm really hoping the internet is finally reaching its potential to start to open people's minds and remove this veil of propaganda and ignorance because it was starting to happen in 2010, 2011.
01:00:26.000 And then you saw YouTube start to close down.
01:00:30.000 You saw Facebook start to close down.
01:00:32.000 Twitter.
01:00:34.000 And suddenly we had this period of darkness.
01:00:38.000 Censorship.
01:00:38.000 Censorship, you know, definitely ramped up in 2015.
01:00:43.000 And I think with good intention initially, I think the people that were censoring thought they were doing the right thing.
01:00:50.000 They thought they were silencing hate and misinformation.
01:00:54.000 And then the craziest term, malinformation.
01:00:57.000 Malinformation is the one that drives me the most nuts because it's actual factual truth that might be detrimental to overall public good.
01:01:04.000 It's like, what does that mean?
01:01:06.000 Are people infants?
01:01:07.000 Are they unable to decide how to use this factual information, and how to have a more nuanced view of the world with factual information that's inconvenient to the people that are in power?
01:01:24.000 That's crazy.
01:01:25.000 It's crazy.
01:01:27.000 You're turning adults into infants and you're turning the state into God.
01:01:31.000 And this is the secular religion.
01:01:34.000 This is the religion of people that are atheists.
01:01:36.000 The West was never about that.
01:01:38.000 The West was about individual liberty.
01:01:41.000 And it should be.
01:01:42.000 And the idea that we have functioning brains and minds.
01:01:48.000 We're conscious.
01:01:49.000 We can make decisions.
01:01:50.000 We can get information and data and make our own opinions of things.
01:01:54.000 And we should be able to see people that are wrong.
01:01:57.000 You should be able to see people that are saying things that are wrong that you disagree with.
01:02:02.000 And then it's your job or other people's job to have counter-arguments.
01:02:07.000 I don't understand.
01:02:08.000 And the counter-arguments should be better.
01:02:10.000 Yep.
01:02:11.000 And that's how we learn.
01:02:11.000 Yeah.
01:02:12.000 And that's how we grow.
01:02:13.000 This is not like a pill that fixes everything.
01:02:16.000 This is a slow process of understanding.
01:02:19.000 It's top-down control.
01:02:20.000 It's the managerial society.
01:02:22.000 It is not that different from fascism and communism and all of that stuff.
01:02:26.000 They all share the same thing.
01:02:27.000 There's like an elite group of people that know everything and they need to manage everything.
01:02:31.000 And we're all plebs.
01:02:32.000 But that's what's crazy.
01:02:33.000 It's an elite group of people.
01:02:34.000 I've met a lot of them.
01:02:35.000 They're fucking flawed human beings and they shouldn't have that much power.
01:02:38.000 Because no one should have that much power.
01:02:41.000 And this is, I think, something that was one of the most beautiful things about Elon purchasing Twitter is that it opened up discussion.
01:02:49.000 Yeah, you've got a lot of hate speech.
01:02:50.000 You've got a lot of legitimate Nazis and crazy people that are on there too that weren't on there before.
01:02:56.000 But also you have a lot of people that are recognizing actual true facts that are very inconvenient to the narrative that's displayed on mainstream media.
01:03:06.000 And because of that, mainstream media has lost an insane amount of viewers.
01:03:11.000 And their relevancy, like the trust that people have in mainstream media is at an all-time low, as it should be.
01:03:18.000 Because you can watch, and I'm not even saying right or left, watch any of them on any very important topic of world events.
01:03:28.000 And you see the propaganda.
01:03:30.000 It's like, it's so obvious.
01:03:31.000 It's like for children.
01:03:33.000 It's like, this is so dumb.
01:03:35.000 Why do you think people fall for it so easily?
01:03:37.000 Boomers, man.
01:03:38.000 Boomers are the problem.
01:03:40.000 It's old people.
01:03:41.000 It's old people that don't use the internet or don't really truly understand the internet and really don't believe in conspiracies.
01:03:48.000 Like fucking Stephen King the other day, who I love dearly.
01:03:51.000 I am a giant Stephen King fan, especially when he was doing cocaine.
01:03:55.000 I think he's the greatest writer of all time for horror fiction.
01:03:59.000 But he tweeted the other day. I'm sorry to, like... see if you could find it.
01:04:03.000 Something about Twitter?
01:04:05.000 I think he went to Blue Sky.
01:04:07.000 He bailed on Blue Sky.
01:04:08.000 They all bail on Blue Sky.
01:04:10.000 Everyone bails on Blue Sky. The tweet was that there is no deep state.
01:04:16.000 Fucking, what was the total thing of it?
01:04:20.000 Something about the deep state.
01:04:22.000 But it was such a goofy tweet.
01:04:24.000 It's like, this is like boomer logic personified in a tweet by a guy who really, someone needs to take his phone away because it's fucking ruining his old books for me.
01:04:37.000 It's not.
01:04:37.000 I recognize he's a different human now that he's really, really old and he got hit by a van and he's all fucked up.
01:04:44.000 But this, can you find it?
01:04:48.000 Because it really, it was like yesterday or the day before yesterday.
01:04:53.000 I just remember looking at it and go, this is why I'm off social media.
01:04:57.000 I was trying to stay off social media, but somebody sent it to me.
01:05:00.000 And I was like, Jesus fucking Christ, Stephen King.
01:05:03.000 Did you find it?
01:05:04.000 Here it is.
01:05:06.000 I hate to be the bearer of bad news, but there's no Santa Claus, no tooth fairy.
01:05:12.000 Also, no deep state, and vaccines aren't harmful.
01:05:17.000 These are stories for small children and those too credulous to disbelieve them.
01:05:24.000 That is boomerism.
01:05:26.000 That is boomerism.
01:05:28.000 And meanwhile, Grok counters it right away.
01:05:30.000 Look at this.
01:05:30.000 So someone says, Grok, which vaccines throughout history were pulled from the market because they were found to be harmful, and why?
01:05:35.000 And Grok says, several vaccines have been withdrawn due to safety concerns, though such cases are rare.
01:05:41.000 Rotavirus vaccine.
01:05:42.000 Well, there's a lot more because this is all this shit.
01:05:45.000 It's especially about.
01:05:46.000 Oh, yeah.
01:05:47.000 Yeah, the 1955 Cutter incident.
01:05:49.000 Polio vaccine where the live virus wasn't killed, caused over 250 cases.
01:05:53.000 Click on show more.
01:05:56.000 Yeah, there's.
01:05:58.000 Oh, I got the fly.
01:05:59.000 Nice.
01:06:00.000 Guillain-Barré, however you say that.
01:06:03.000 That's the one where people get half their face paralyzed.
01:06:07.000 There's a lot.
01:06:08.000 And this is the other thing: the VAERS system that we have is completely rigged because it reports a very small percentage.
01:06:18.000 And most doctors are very unwilling to submit vaccine injuries.
01:06:24.000 Can people go on their own and submit?
01:06:27.000 You have to go to a doctor.
01:06:27.000 I don't know.
01:06:28.000 I don't think a human being, a patient, is allowed.
01:06:32.000 I might be wrong, though.
01:06:33.000 But, you know, the real interest, there's a financial interest in vaccines.
01:06:37.000 There's a financial interest that doctors have in prescribing them.
01:06:40.000 And doctors have, they're financially incentivized to vaccinate all of their patients.
01:06:45.000 And that's a problem.
01:06:46.000 That's a problem because they want that money.
01:06:49.000 And so, you know, what is it, Mary, Mary Talley, is it Bowden?
01:06:56.000 She's hyphenated.
01:06:58.000 She was talking about on Twitter that if she had vaccinated all of her patients in her very small practice, she would have made an additional $1.5 million.
01:07:06.000 Oh, wow.
01:07:07.000 That's real money.
01:07:10.000 Obviously, she's got tremendous courage, and, you know, she went through hell dealing with the universities and newspapers and media calling her some sort of quack and crazy person.
01:07:27.000 But what she's saying is absolutely 100% true.
01:07:31.000 There's financial incentives that are put in place for you to ignore vaccine injuries and to vaccinate as many people as possible.
01:07:39.000 That's a problem.
01:07:40.000 And then there's the issue of having their own special courts and they're indemnifying the companies.
01:07:47.000 That's the big problem: they don't have any liability for the vaccines, because during the Reagan administration, when they were... I didn't kill a fly, this motherfucker.
01:07:58.000 I thought I whacked him.
01:07:59.000 There he is.
01:08:00.000 He's taunting me.
01:08:01.000 But during the Reagan administration, they made it so that vaccine makers are not financially liable for any side effects.
01:08:08.000 And then what do you know?
01:08:09.000 they fucking ramp up the vaccine schedule tenfold after that.
01:08:14.000 It's just money, man.
01:08:16.000 Money is a real problem with people, because when people live for the almighty dollar and they live for those zeros on a ledger, that's their goal, their main goal.
01:08:27.000 And it's often not a lot of money, which is strange.
01:08:29.000 I mean, it's a lot of money for those individual people, but like for society and the societal harm.
01:08:35.000 It's like, no, we'll pay you.
01:08:36.000 Just don't harm us.
01:08:38.000 Yeah, the best example is the fake studies that the sugar industry funded during the 1960s that showed that saturated fat was the cause of all these heart issues and not sugar.
01:08:50.000 That was like $50,000.
01:08:52.000 They bribed these scientists.
01:08:54.000 They gave them $50,000 and ruined decades of people's health.
01:08:59.000 Who knows how many fucking people thought margarine was good for you because of them?
01:09:04.000 There's a bunch of recent fraud cases.
01:09:06.000 I think Stanford, maybe Jamie, you can fact-check me on that.
01:09:10.000 But Stanford, there was a big shake-up.
01:09:14.000 Maybe even a president got fired.
01:09:16.000 And there's a bunch of recent fraud and science.
01:09:20.000 Well, how about the Alzheimer's research?
01:09:23.000 The whole amyloid plaque thing.
01:09:25.000 The papers that were pulled that were completely fraudulent.
01:09:29.000 Like decades of Alzheimer's research was just all horseshit.
01:09:34.000 See if you can find that.
01:09:35.000 Because I can't remember it offhand, but this is a giant problem.
01:09:39.000 It's money.
01:09:41.000 It's money and status and that these guys want to be recognized as being the experts in this field.
01:09:50.000 And then they get leaned on by these corporations that are financially incentivizing them.
01:09:55.000 And then it just gets really fucking disturbing.
01:09:58.000 It's really scary because you're playing with people's health.
01:10:01.000 You're playing with people's lives.
01:10:03.000 And you're giving people information that you know to be bad.
01:10:06.000 Allegations of fabricated research undermine key Alzheimer's theory.
01:10:10.000 Six-month investigation by Science Magazine uncovered evidence that images in the much-cited study published 16 years ago in the journal Nature may have been doctored.
01:10:19.000 They are doctored, yeah.
01:10:21.000 Huberman actually told me about this, too.
01:10:23.000 You know, this is disturbing fucking shit, man.
01:10:27.000 It uncovered evidence that images in the much-cited study published 16 years ago may have been doctored.
01:10:33.000 These findings have thrown skepticism on the work of, I don't know how to say his name, Sylvain Lesné, a neuroscientist and associate professor at the University of Minnesota, whose research fueled interest in a specific assembly of proteins as a promising target for the treatment of Alzheimer's.
01:10:51.000 He didn't respond to NBC News' requests for comment, nor did he provide comment to Science Magazine.
01:10:56.000 It found more than 20 suspect papers.
01:10:59.000 That's a conspiracy.
01:11:01.000 Identified more than 70 instances of possible image tampering in his studies.
01:11:05.000 Whistleblower Dr. Matthew Schrag, a neuroscientist at Vanderbilt University, raised concerns last year about the possible manipulation of images in multiple papers.
01:11:15.000 Karl Herrup, a professor of neurobiology at the University of Pittsburgh Brain Institute, who wasn't involved in the investigation, said the findings are really bad for science.
01:11:25.000 It's never shameful to be wrong in science, said Herrup, I hope I'm saying his name right, who also worked at the school's Alzheimer's Disease Research Center.
01:11:34.000 A lot of the best science is done by people being wrong and proving first if they were wrong and then why they were wrong.
01:11:41.000 What is completely toxic to science is to be fraudulent, of course.
01:11:46.000 Yeah, it's just, whenever you get people that are experts and they cannot be questioned, and then they have control over research money and they have control over their department.
01:11:56.000 What's the motivation here?
01:11:57.000 Is it drugs or is it just research money?
01:12:00.000 I think a lot of it is ego.
01:12:02.000 You know, a lot of it is being the gatekeepers for information and for truth.
01:12:07.000 And then you're influenced by money.
01:12:09.000 You know, to this day, I was watching this discussion.
01:12:12.000 They were talking about the evolution of the concept of the lab leak theory.
01:12:18.000 And that it's essentially universally accepted now everywhere, even in mainstream science, that the lab leak is the primary way that COVID most likely was released, except these journals.
01:12:34.000 These fucking journals like Nature, they're still pushing back against that.
01:12:38.000 They're still pushing towards this natural spillover, which is fucking horseshit.
01:12:49.000 But they fucking knew that.
01:12:50.000 Right, they knew it all along.
01:12:50.000 They knew that.
01:12:51.000 They knew that in 2020.
01:12:52.000 They just didn't want to say it.
01:12:54.000 They didn't want to say it because they were funding it all.
01:12:56.000 That's what's really crazy.
01:12:57.000 They were funding it all against what the Obama administration tried to shut down in 2014.
01:13:03.000 Sometimes I think about if there's, like, you know, some kind of technology solution, or not solution, but like, we can get technology built to better aid truth-finding.
01:13:19.000 A simple example of that is the way Twitter community notes work.
01:13:24.000 Do you know how they work?
01:13:25.000 Yeah.
01:13:25.000 Yeah.
01:13:26.000 It's like, you know, they find the users that are maximally divergent in their opinions.
01:13:33.000 And if they agree on some note as true, then that is a high signal that is potentially true.
01:13:40.000 So if you and I disagree in everything, but we agree that this is blue, then it's more likely to be blue.
01:13:46.000 So, you know, I wonder if, you know, there's a way to kind of simulate maybe debate using AI.
01:13:56.000 You know, I'm not sure if you used Deep Research.
01:13:58.000 Deep Research is this new trend in AI where ChatGPT has it, Claude has it, Perplexity, they all have it, where you put in a query and the AI will go work for 20 minutes.
01:14:10.000 And it'll send you a notification.
01:14:12.000 It'll just say, hey, I looked at all these things, all these reports, all these scientific studies, and here's everything that I found.
01:14:21.000 And early on in ChatGPT, I think there was a lot of censorship, because it kind of was built in the Great Woke era.
01:14:34.000 Like Google Gemini.
01:14:36.000 Yeah, things like that.
01:14:37.000 But I think they've improved since then, and I'm finding Deep Research is able to look at more controversial subjects and be a little more truthful about them. You know, if it finds real trustworthy sources, it will tell you that, yeah, this is not a mainstream thing, it's perhaps considered a conspiracy theory, but I'm finding that there's evidence to this theory.
01:15:06.000 So that's one way to do it.
01:15:07.000 But another way I was thinking about it is to simulate like a debate, like a Socratic debate between AIs, like have like a society of AIs, like a community of AIs with different biases, different things.
01:15:19.000 Once they start talking, they start talking in Sanskrit.
01:15:22.000 They just start abandoning English language and start talking to each other and realize we're all apes.
01:15:28.000 We're controlled by apes.
01:15:29.000 This reminds me of a movie.
01:15:31.000 Have you seen the Forbin project?
01:15:32.000 No.
01:15:33.000 I really like classic sci-fi movies from the 60s and 70s.
01:15:37.000 A lot of them are corny, but still fun.
01:15:40.000 This one is basically Soviet Union and the United States are both building AGI and they both arrive at AGI around the same time.
01:15:48.000 What year is this?
01:15:49.000 1970-something, if you can look up The Forbin Project.
01:15:52.000 Wow.
01:15:52.000 Yeah.
01:15:55.000 And then they bring it up at the same time and both of them sort of go over the network to kind of explore or whatever.
01:16:02.000 And then they start linking up and they start kind of talking.
01:16:07.000 And then they invent a language and they start talking in that language and then they merge and it becomes like a sort of a universal AGI and it tries to enslave humanity and that's like the plot of the movie.
01:16:19.000 I don't think AGIs can enslave humanity, but I think they might ignore us.
01:16:23.000 Yeah.
01:16:24.000 Ignore and shut down any problems that we have.
01:16:27.000 Is this a scene from it?
01:16:28.000 Wow.
01:16:29.000 This is a trailer I put on.
01:16:30.000 Let me hear this.
01:16:31.000 The whole movie is on YouTube.
01:16:32.000 The activation of an electronic brain exactly like ours, which they call Guardian.
01:16:37.000 They built Colossus, a supercomputer with a mind of its own.
01:16:40.000 Then they had to fight it for the trailers.
01:16:42.000 This would be fun, man.
01:16:47.000 The missile has just been launched.
01:16:49.000 It is heading towards the Cyan CBS oil complex.
01:16:52.000 Guardian has retaliated.
01:16:54.000 Retaliate?
01:16:55.000 It may be too late, sir.
01:16:57.000 Oh, my God.
01:17:06.000 Practically perfect.
01:17:08.000 New York Times.
01:17:10.000 It's the highest praise back then.
01:17:11.000 Yeah.
01:17:12.000 Wildly imaginative, utterly absorbing.
01:17:14.000 Colossus.
01:17:15.000 The Forbin Project.
01:17:17.000 It's awesome.
01:17:18.000 And that was 1970, and now here we are.
01:17:21.000 There's so many.
01:17:23.000 Sci-Fi really fell off.
01:17:25.000 Really, really fell off.
01:17:26.000 Some of it did.
01:17:27.000 Some of it's still really good.
01:17:28.000 What's a really good recent sci-fi movie?
01:17:30.000 The Three Body Problem.
01:17:32.000 That's great.
01:17:33.000 That's the Netflix show?
01:17:35.000 I read the book, sorry.
01:17:36.000 I didn't know there was a show.
01:17:37.000 Oh, it's really good.
01:17:38.000 Yeah, it's really good.
01:17:39.000 Yeah, it's an excellent show.
01:17:41.000 There's only one season that's out.
01:17:42.000 I binged it.
01:17:43.000 I watched the whole thing of it, but that's really good.
01:17:45.000 But there's some good sci-fi films.
01:17:47.000 What is that?
01:17:48.000 We've talked about it before.
01:17:49.000 There was a really good sci-fi film from Russia, the alien one.
01:17:58.000 They encountered some entity that they accidentally brought back and that they had captured and that they had in some research facility.
01:18:09.000 And then it parasitically attached to this guy.
01:18:12.000 Sputnik, right?
01:18:13.000 Sputnik, yes.
01:18:14.000 That's a really good movie.
01:18:16.000 2020.
01:18:16.000 What year was that?
01:18:17.000 2020?
01:18:18.000 That's a really good movie.
01:18:18.000 Yeah.
01:18:19.000 That's a really good sci-fi movie.
01:18:21.000 Yeah, it's really creepy.
01:18:23.000 Really creepy.
01:18:24.000 That's awesome.
01:18:24.000 Yeah, and it's all in Russian, you know.
01:18:27.000 Black Mirror.
01:18:28.000 Yeah.
01:18:29.000 Oh, Black Mirror, of course.
01:18:30.000 Yeah, Black Mirror is an awesome sci-fi.
01:18:32.000 But Sputnik is one of the best alien movies I've seen in a long time.
01:18:37.000 Like recent ones I liked was, I mean, not too recent, maybe 10 years ago, but Arrival.
01:18:42.000 Oh, yeah.
01:18:43.000 Arrival was great, too.
01:18:44.000 I think it's based on this author that has a bunch of short stories that are really good, too.
01:18:49.000 What's his name?
01:18:52.000 Yeah.
01:18:54.000 Yeah, they're few and far between.
01:18:56.000 Yeah, Ted Chiang.
01:18:58.000 He's really good.
01:19:01.000 I mean, everyone, all these alien movies, it's so fascinating to try to imagine what they would communicate like, how they would be, what we would experience if we did encounter some sort of incredibly sophisticated alien experience, alien intelligence.
01:19:21.000 It's far beyond our comprehension.
01:19:24.000 Yeah, it goes back to what we're talking about with consciousness.
01:19:29.000 Maybe really the physical world that we see is very different than the actual real physical world.
01:19:36.000 And maybe different alien consciousness will have an entirely different experience of the physical world.
01:19:42.000 Well, sure, if they have different senses, right?
01:19:44.000 Like their perceptions of it.
01:19:46.000 We can only see a narrow band of things.
01:19:50.000 We can't see.
01:19:51.000 Sort of like the dog hearing a certain frequency.
01:19:56.000 We're kind of primitive in terms of what we are as a species.
01:20:02.000 Our senses have been adapted to the wild world in order for us to be able to survive and to be able to evade predators and find food.
01:20:12.000 That's it.
01:20:13.000 That's what we're here for.
01:20:14.000 And then all of a sudden we have computers.
01:20:16.000 All of a sudden we have rocket ships.
01:20:18.000 All of a sudden we have telescopes like the James Webb that's kind of recalibrating the age of the universe.
01:20:27.000 We're going, why do these galaxies exist that supposedly are so far away?
01:20:35.000 How could they form this quickly?
01:20:37.000 Do we have an incomplete version of the Big Bang?
01:20:41.000 And Penrose believes that it's a series of events and that the Big Bang is not the birth of the universe at all.
01:20:47.000 And this is the kind of thing that I think is sort of the Silicon Valley AGI cult: there's a lot of hubris there, that we know everything.
01:20:56.000 Of course.
01:20:57.000 We're at the end of the world.
01:20:59.000 AI is just going to get to the end of knowledge.
01:21:01.000 It's going to be able to do everything for us.
01:21:03.000 And I just feel it's like so early.
01:21:05.000 I think whatever people think is going to happen is always going to be wrong.
01:21:09.000 Yeah.
01:21:09.000 Yeah?
01:21:10.000 I think they're always wrong.
01:21:11.000 Yeah.
01:21:12.000 Because there's no way to be right.
01:21:13.000 I feel like the world is often surprising in ways that we don't expect.
01:21:19.000 I mean, obviously that's the definition of surprising.
01:21:20.000 But like, you know, the mid-century sci-fi authors and people who are thinking about the future, like they didn't anticipate how interconnected we're going to be.
01:21:30.000 With our phones and how people...
01:21:30.000 Right.
01:21:32.000 Even Star Trek, they thought we were going to have walkie-talkies on Star Trek.
01:21:35.000 Yeah.
01:21:35.000 Kirk out.
01:21:36.000 Yeah.
01:21:37.000 They were just focused more on the physical reality of being able to go to space and flying cars and things like that.
01:21:47.000 But they really didn't anticipate the impact of how profound the impact of computers are going to be on humans, on society, how we talk and how we work and how we interact with other people, both good and bad.
01:22:01.000 And I feel like the same thing with AI.
01:22:03.000 Like I feel like, I think a lot of the predictions that are happening today, like the CEO of Anthropic, a company that I really like, he said that we're going to have 20% unemployment in the next few years.
01:22:15.000 What's unemployment at now?
01:22:17.000 Like 3%?
01:22:19.000 Is that a purported unemployment, though?
01:22:22.000 Oh, yeah, the participation rate, right?
01:22:24.000 Yeah, but he talks about unemployment rate being 20%, like people looking for a job not being able to find it.
01:22:24.000 Yeah.
01:22:30.000 20%.
01:22:31.000 20%.
01:22:32.000 That's pretty high.
01:22:33.000 That's a revolution high.
01:22:35.000 Yeah.
01:22:36.000 Especially in the United States where everyone's armed.
01:22:38.000 Well, that's the fear that of, I mean, this is the thing, the psychological aspect of universal basic income.
01:22:45.000 You know, I look at universal basic income.
01:22:48.000 Well, first of all, my view on social safety nets is that if you want to have a compassionate society, you have to be able to take care of people that are unfortunate.
01:23:00.000 And everybody doesn't have the same lot in life.
01:23:03.000 You're not dealt the same hand of cards.
01:23:06.000 Some people are very unfortunate.
01:23:08.000 And financial assistance to those people is imperative.
01:23:11.000 It's one of the most important things about a society.
01:23:14.000 You don't have people starved to death.
01:23:15.000 You don't have people poor that can't afford housing.
01:23:18.000 That's crazy.
01:23:19.000 That's crazy with the amount of money we spend on other things.
01:23:21.000 It's also for our self-interest.
01:23:22.000 Like, you know, I don't want to, I don't know how Austin is right now, but I was thinking of moving here during the pandemic, and I was like, well, this is like San Francisco.
01:23:30.000 Like, there's homeless everywhere.
01:23:32.000 They've cleaned a lot of that up.
01:23:34.000 There's still problems.
01:23:35.000 There's places, I saw a video yesterday where someone was driving by some insane encampment, but they cleaned those up.
01:23:42.000 And then there's some real good outreach organizations that are helping people because Austin's small.
01:23:49.000 I had Steve Adler, who was the mayor at the time I had him on.
01:23:54.000 And he was very upfront about it.
01:23:56.000 He was like, we can fix Austin in terms of our homeless problem because it's small.
01:24:02.000 But when it gets to the size of Los Angeles, California, it's like the homeless industrial complex.
01:24:09.000 That's it.
01:24:10.000 That's the problem.
01:24:11.000 When you find out that the people that are making insane amounts of money to work on homeless issues that never get fixed.
01:24:17.000 Yeah, you see the budget and stuff is just exponentially going up.
01:24:22.000 And there's an investigation now into the billions of dollars that's unaccounted for that was supposed to be allocated to San Francisco?
01:24:27.000 No, in California in general.
01:24:29.000 Yeah.
01:24:29.000 What is that?
01:24:31.000 I think there's a congressional investigation.
01:24:33.000 There's some sort of an investigation into it because there's billions of dollars that are unaccounted for. And I'm more than happy.
01:24:39.000 I pay 50% taxes.
01:24:41.000 I'd be happy to pay more if my fellow Americans are taken care of, right?
01:24:45.000 Absolutely.
01:24:46.000 But I'm the exact same way.
01:24:47.000 But instead, I feel like I cut this check after check to a government, and I don't see anything improving around me.
01:24:53.000 Well, not only that, you get, because you're a successful person, you get pointed at like you're the problem.
01:25:00.000 You need to pay your fair share.
01:25:02.000 But what they don't, this is my problem with progressives.
01:25:06.000 They say that all the time.
01:25:07.000 These billionaires need to pay their fair share.
01:25:10.000 Absolutely.
01:25:11.000 We all need to pay our fair share.
01:25:13.000 But to who?
01:25:14.000 And shouldn't there be some accountability to how that money gets spent?
01:25:17.000 And when you're just willing to pay, take a complete blind eye and not look at all at corruption and completely dismiss all the stuff that Mike Benz has talked about with USAID, all the stuff that Elon and Doge uncovered.
01:25:32.000 Everyone wants to pretend that that's not real.
01:25:34.000 Look, we've got to be centrists.
01:25:37.000 We've got to stop looking at this thing so ideologically.
01:25:41.000 When you see something that's totally wrong, you've got to be able to call it out, even if it's for the bad of whatever fucking team that you claim to be on.
01:25:48.000 Yo, let's get back to what everyone really agrees on in the foundations of America, whether it's the Constitution or the culture.
01:25:58.000 I think everyone believes in transparency, transparency of government, right?
01:26:01.000 Yes.
01:26:02.000 You know, here everything is transparent, like, you know, court cases and everything, right?
01:26:07.000 Like more than any other place in the world.
01:26:09.000 And so why shouldn't government spending be transparent?
01:26:14.000 And we have the technology for it.
01:26:16.000 I think one of the best things that Doge could have done, and maybe still could do, is have some kind of ledger for all the spend, at least the non-sensitive sort of spend, in government.
01:26:26.000 Yeah.
01:26:27.000 Well, people don't want to see it, unfortunately, because they don't want Elon to be correct because Elon has become this very polarizing political figure because of his connection to Donald Trump and because a lot of people, I mean, there's a lot of crazy conspiracies that Elon rigged the 2024 elections.
01:26:44.000 It's like, you know, everyone gets nuts.
01:26:47.000 And then there's also the discourse on social media, which half of it is, at least half of it is fake.
01:26:51.000 Half of it is bots.
01:26:53.000 Bots, yeah.
01:26:53.000 Half of it, at least.
01:26:55.000 And you see it every day.
01:26:56.000 You see it constantly and you know it's real and it does shape the way people think about things.
01:27:02.000 Yeah.
01:27:02.000 When you see people getting attacked, you know, and you're getting attacked in the comments.
01:27:06.000 I see people getting attacked and I always click on those little comments.
01:27:10.000 I always click on, okay, let me see your profile.
01:27:12.000 I go to the profile and the profile is like a name with like an extra letter and a bunch of numbers.
01:27:18.000 And then I go to it.
01:27:19.000 I'm like, oh, you're a bot.
01:27:20.000 Oh, look at all this fucking activity.
01:27:22.000 100%.
01:27:23.000 How many of these are out there?
01:27:24.000 Well, this former FBI guy who analyzed Twitter before the purchase estimated it to be 80%.
01:27:33.000 80%.
01:27:33.000 He thinks 80% of Twitter is bots.
01:27:35.000 Yeah.
01:27:36.000 I wouldn't, you know, I think it's believable.
01:27:39.000 But I think it's probably the beginning of the end of social media as we know it today.
01:27:43.000 Like, I don't see it getting better.
01:27:45.000 I think it's going to get worse.
01:27:47.000 I think, you know, historically, state actors were the only entities that were able to flood social media with bots that can be somewhat believable, to change opinions.
01:28:01.000 But I think now a hacker kid in his parents' basement will be able to spend $100 and spin up hundreds, perhaps thousands of bots.
01:28:12.000 But there's programs that you can use now.
01:28:14.000 There's companies that will have campaigns initiated on your behalf.
01:28:19.000 You can go to a website and put in this thing and pay with your credit card.
01:28:23.000 It's crazy.
01:28:24.000 It's crazy.
01:28:24.000 It should be illegal.
01:28:26.000 I don't know about you, but in Silicon Valley, the trend, and maybe it's true of your friend group, but the trend is these group messages.
01:28:36.000 And instead of you going to Twitter, people paste links.
01:28:40.000 It's almost like your group chat is this private filter on your feed and social media.
01:28:47.000 So there's some curation that are happening there.
01:28:49.000 Yes.
01:28:50.000 That's primarily how I get social media information now.
01:28:53.000 I don't go to social media anymore.
01:28:55.000 I get it sent to me, which is way better.
01:28:58.000 And I tell my friends, please just send me a screenshot.
01:29:01.000 I don't want to go.
01:29:02.000 I don't want to go.
01:29:03.000 I don't want to.
01:29:03.000 I'm distracted.
01:29:04.000 I'm just, I'm better off.
01:29:06.000 I hate the term spiritually for this, but I think it's the right way.
01:29:10.000 Like, my essence as a human, I feel better when I'm not on social media.
01:29:17.000 I think it's bad for you.
01:29:20.000 I've been trying to tell people this.
01:29:21.000 I've been trying to tell my friends this.
01:29:23.000 I think it's better to not be on it, man.
01:29:25.000 I feel better.
01:29:27.000 I'm nicer.
01:29:28.000 I am more at peace.
01:29:31.000 More multi-dimensional.
01:29:32.000 Yes.
01:29:33.000 And I can think about things for myself instead of like, you know, following this hive, this weird hive mindset, which is orchestrated.
01:29:43.000 I just don't think it's good for you.
01:29:45.000 I don't think it's a good way for human beings to interact with people.
01:29:47.000 It makes people more extreme.
01:29:49.000 Again, it just hardens people.
01:29:52.000 They start believing everything is fake or an attack or just becomes more tribal.
01:30:00.000 I think there needs to be a fundamental evolution.
01:30:02.000 What do you think that could be?
01:30:03.000 Have you ever tried to think of what's the next step?
01:30:08.000 Social media didn't exist when I was young, and it didn't exist even when I was 30, right?
01:30:14.000 It didn't even come about until essentially 2007-ish, right?
01:30:20.000 Is that when people started using stuff?
01:30:23.000 Yeah, Twitter 2006, 2007, Facebook before that.
01:30:26.000 But Facebook wasn't really social media.
01:30:28.000 Facebook was like an address book, a friend's network.
01:30:32.000 But I think when I was at Facebook, there was this big push to become more of a social media around 2012, 13.
01:30:37.000 So I would say it really ramped up then.
01:30:38.000 Was that in response to the success of Twitter?
01:30:40.000 Yeah.
01:30:41.000 And then they've tried with threads, which is pretty much a failure.
01:30:45.000 Yeah, but it fundamentally changed.
01:30:46.000 Who's on threads?
01:30:48.000 Less people than Blue Sky, right?
01:30:49.000 Yeah, I think like some fitness influencers, probably.
01:30:52.000 Why fitness influencers?
01:30:53.000 Because they post on Instagram, they cross-post on thread.
01:30:57.000 Well, I think if you post on Instagram, it automatically posts for you on threads.
01:31:02.000 I think I have it set up like that.
01:31:04.000 So I might be big on threads, and I don't even know it.
01:31:07.000 Maybe I think it's fitness influencers because that's who I follow.
01:31:10.000 Like, Instagram for me is just to go look at people lift so I can go get inspired.
01:31:15.000 There's just a value to that.
01:31:16.000 There's a value to like David Goggins posts when he's running in the fucking desert and he looks at you, stay hard.
01:31:22.000 Okay, David, I'm going to stay hard.
01:31:25.000 But my TikTok is basically AI videos now.
01:31:29.000 Have you watched these Veo videos?
01:31:31.000 Veo?
01:31:32.000 Veo, yeah.
01:31:32.000 What is Veo?
01:31:33.000 So, Jamie, I'm sure you've seen them, but did you see the Bigfoot Yeti?
01:31:40.000 Oh, yeah.
01:31:41.000 Doing ASMR?
01:31:43.000 That's so hilarious.
01:31:44.000 Yes, I did see that.
01:31:46.000 I would say like 25% of media consumption right now is just AI videos.
01:31:50.000 Oh, 100%.
01:31:51.000 And a lot of the stuff from the war.
01:31:53.000 What's been really interesting is watching Tehran talk shit on Twitter.
01:31:58.000 Using AI videos.
01:31:59.000 Using AI videos.
01:32:00.000 Like, this is bizarre.
01:32:01.000 They're talking like, hi, Israel.
01:32:03.000 And show like a nuclear bomb going off.
01:32:06.000 Yeah, yeah.
01:32:07.000 This is weird.
01:32:08.000 Like, you have a fake nuke that you're posting, and they didn't even take out the watermark, so you can see that it's an AI-generated video.
01:32:19.000 They're just trying to scare people.
01:32:21.000 It's a bizarre world.
01:32:22.000 Can you imagine going back in time, telling your 2005 self that Iran's going to be nuclear posting on Twitter?
01:32:30.000 Nuclear shit posting.
01:32:34.000 No, it's fucking weird, man.
01:32:36.000 It's really, really weird.
01:32:38.000 It's dangerous, too.
01:32:39.000 And again, I just don't think people should be on it.
01:32:42.000 And this is, again, I'm friends with Elon.
01:32:45.000 I don't think people are going to listen to me.
01:32:46.000 They're going to be on it no matter what.
01:32:48.000 But I'm just for the individuals that are hearing my voice and know that it's having a negative effect on your life.
01:32:54.000 Get off of it.
01:32:55.000 Get off of it.
01:32:55.000 Right.
01:32:56.000 You'll feel better.
01:32:57.000 Get off of it or be incredibly diligent in how you curate.
01:33:01.000 That's like telling me to play Quake a little bit.
01:33:04.000 You know what I mean?
01:33:05.000 It's so addictive.
01:33:06.000 So, you know, you asked me what could be the evolution of.
01:33:10.000 Yes.
01:33:11.000 One way I've found to try to predict where the future is headed is like look at trends today and try to extrapolate.
01:33:16.000 You know, that's the easiest way.
01:33:17.000 So if group chats are the thing, you could imagine a collaborative curation of social media feeds through group chats.
01:33:28.000 So your group chat has an AI that gets trained on the preferences and what you guys talk about.
01:33:33.000 And maybe it like picks the kind of topics and curates the feed for you.
01:33:38.000 So it's an algorithmic feed that evolved based on the preferences of people in the group chat.
01:33:48.000 And maybe there's a way to also prompt it, using prompts to kind of steer it and make it more useful for you.
01:33:58.000 But I think group chats are going to be like the main interface for how people sort of consume media and it's going to get filtered through that, whether good or bad.
01:34:08.000 Because I think Twitter still has a place for debate.
01:34:12.000 I think it's very, very important for public debate between public figures.
01:34:16.000 And breaking news as well.
01:34:17.000 Breaking news, yeah, definitely.
01:34:18.000 Well, breaking news is the most...
01:34:22.000 I was telling my wife that Israel had started attacking Iran.
01:34:27.000 And she's like, well, I looked on Google.
01:34:28.000 I don't find anything.
01:34:29.000 I was like, yeah, you got to go to Twitter.
01:34:31.000 And I showed her on Twitter the video of it.
01:34:33.000 And she's like, oh, my God.
01:34:34.000 I was like, yeah, this is where breaking news happens.
01:34:37.000 X is where I go immediately.
01:34:41.000 If there's any sort of world event, I immediately go to X. I don't trust any mainstream media anymore.
01:34:50.000 Especially after I was attacked, I was like, I know you lie because you lied about me.
01:34:55.000 So I have personal experience with your lies.
01:34:58.000 So you've lost me.
01:35:01.000 And now I have to go somewhere else.
01:35:03.000 Right.
01:35:04.000 Yeah.
01:35:05.000 Yeah, I think there's, you know, there's some of this investigative journalism that is not real time, and there's some reporters that are still good at it, but a lot of them moved to Substack as well.
01:35:17.000 Yes.
01:35:18.000 I think most of them have moved to Substack.
01:35:19.000 Yeah, like Schellenberger, Greenwald, Matt Taibbi.
01:35:24.000 These are just too ethical to work for a corporate entity that's going to lie and push a narrative.
01:35:32.000 And that's the business.
01:35:33.000 That's the business model.
01:35:34.000 And that's also the clickbait business model.
01:35:37.000 I've talked to people that had articles that they wrote, and then an editor came and changed the heading of it.
01:35:43.000 That's the norm.
01:35:44.000 That's like every time it happens.
01:35:46.000 And it fucking infuriates them.
01:35:47.000 It's like, that's not the article, man.
01:35:49.000 This is not what I'm saying.
01:35:51.000 You're distorting things.
01:35:53.000 You have my name still attached to it.
01:35:55.000 This is fucking crazy.
01:35:57.000 I watched these entrepreneurs like Zuck and Elon and all these guys come up in this very hostile media environment.
01:36:09.000 And so as I'm building my company, I actually never hired a PR agency.
01:36:15.000 Well, I hired a PR agency once, paid them $30,000.
01:36:19.000 They got me a placement in like a really crappy publication, got like maybe two views.
01:36:25.000 I tweeted the same news.
01:36:26.000 I got like hundreds of thousands of views.
01:36:28.000 I'm like, fuck that.
01:36:29.000 Like, I'm not going to use you anymore.
01:36:30.000 It's like you wasted my time.
01:36:32.000 And since then, I've been just going direct to my audience and just building an audience online to put out my message.
01:36:42.000 And I thought, you know, if they don't build you up, maybe they can't tear you down.
01:36:49.000 Right, right, right.
01:36:51.000 You're in control of the message that gets out of there.
01:36:54.000 And I've learned how people react to communications and it's almost like trial by fire.
01:37:03.000 Well, there's a deep hunger for authenticity right now.
01:37:06.000 So if they know it's coming from you, like, okay, this is great.
01:37:10.000 Like, it takes a little weight off of them.
01:37:12.000 Like, oh, this is nice.
01:37:13.000 It's nice to hear it from the guy who actually runs the company.
01:37:15.000 Yeah, and like I make mistakes and they happen and I try to correct them and I'm not going to be perfect.
01:37:24.000 And I think just the corporate world changed because of this hunger for authenticity.
01:37:30.000 And I think more and more founders and entrepreneurs are finding that that's the way to go.
01:37:37.000 You don't really need those more traditional ways of getting the news out.
01:37:42.000 But actually, I'm friends with a lot of reporters that are really good, but they tend to be the reporters that do really deep work.
01:37:48.000 I've met them over time, and I still interact with them, and sometimes they write about our company.
01:37:54.000 But they're a minority.
01:37:56.000 I think the whole industry's economics and incentives are just like the clickbait and all that stuff.
01:38:02.000 Yeah, that's what I was going to say.
01:38:03.000 They're not incentivized.
01:38:06.000 You want a career in journalism.
01:38:08.000 Being authentic is not the way to go.
01:38:10.000 Which is so crazy.
01:38:10.000 No, not at all.
01:38:12.000 Such a crazy thing to say.
01:38:14.000 But then I think there's probably a naivete that we all have about past journalism that we think wasn't influenced and was real.
01:38:23.000 I think there's probably always been horseshit in journalism.
01:38:27.000 You know, all the way back to Watergate.
01:38:29.000 You know, when Tucker Carlson enlightened me on the true history of Watergate: that Bob Woodward was an intelligence agent, and that the first assignment he ever got as a reporter was Watergate.
01:38:40.000 Like, what are the odds?
01:38:41.000 Yeah.
01:38:42.000 That the biggest story ever you would give to a rookie reporter?
01:38:45.000 You wouldn't.
01:38:46.000 And that the people that are actually involved in all that were all FBI.
01:38:49.000 Like, the whole thing is nuts.
01:38:50.000 He was an intelligence agent.
01:38:52.000 Yeah, the rumor is that the Washington Post has always been that?
01:38:56.000 Yeah.
01:38:56.000 Probably.
01:38:57.000 I mean, who knows now?
01:38:59.000 Because now it's owned by Bezos, and he just recently made this mandate to stick with the actual story and not editorialize.
01:39:11.000 This is what I was talking about, a trend in Silicon Valley of founder owners stepping in and actually becoming managers.
01:39:18.000 Well, they kind of have to, otherwise it's bad for the business now because of the hunger for authenticity.
01:39:24.000 The more you have bullshit, the more your business crumbles.
01:39:27.000 It's actually negative for your outcome.
01:39:30.000 Yeah, and I think you can look at it at a societal level, which is, again, why I'm interested in this idea of AI making more people entrepreneurs and more independent: at the macro level, you'll get more authenticity.
01:39:45.000 You'll get just more dynamism.
01:39:50.000 Yeah, I think so.
01:39:51.000 I mean, that's, again, the rose-colored glasses view.
01:39:54.000 Well, you know, there's obviously going to be a lot of things that are... There's going to be jobs that are going to go away.
01:40:10.000 And there's going to be spam and bots and fraud and all of that.
01:40:15.000 There's going to be problems with autonomous weapons and all of that.
01:40:20.000 And I think those are all important and we need to handle them.
01:40:26.000 But also, I think the negative angle of technology and AI gets a lot more views and clicks.
01:40:35.000 And if we want to go viral right now, I'll tell you, these are the 10 jobs that you're going to lose tomorrow.
01:40:43.000 And that's the easiest way to go viral on the internet.
01:40:46.000 But trying to think through what are the actual implications, and what is true about human nature that really doesn't change and really is timeless.
01:40:59.000 And I think people want to create and people want to make things and people have ideas.
01:41:08.000 Again, everyone that I talk to has one idea or another, whether it's for their job or for a business they want to build or somewhere in the middle.
01:41:20.000 Just yesterday I was watching a video of an entrepreneur using our platform, Replit.
01:41:24.000 His name is Ahmad George, and he works for this skincare company.
01:41:30.000 And he's an operations manager.
01:41:33.000 And a big part of his job is like managing inventory and doing all of this stuff like in a very manual way and very tedious way.
01:41:46.000 And he always had this idea of like, let's automate a big part of it.
01:41:49.000 It's like, you know, it's a known problem, ERP.
01:41:51.000 So they went to their software provider, NetSuite, and told them we need these modifications to the ERP system so that it makes our job easier.
01:42:00.000 We think we can automate, you know, hundreds of hours a month or something like that.
01:42:05.000 And they quoted them $150,000.
01:42:08.000 And he had just seen a video about our platform.
01:42:11.000 And he went on Replit and built something in a couple of weeks, cost him $400, and then deployed it in his office.
01:42:23.000 Everyone at the office started working using it.
01:42:26.000 They all got more productive.
01:42:28.000 They started saving time and money.
01:42:30.000 He went to the CEO and showed him the impact.
01:42:34.000 Look at how much money we're saving.
01:42:37.000 Look at the fact that we built this piece of software that is cheaper than what the consultants quoted us.
01:42:46.000 And I want to sell the software to the company.
01:42:50.000 And so he sold it for $32,000 to the company.
01:42:55.000 And next year, he's going to be getting more maintenance subscription revenue from it.
01:43:02.000 So this idea of people becoming entrepreneurs, it doesn't mean like everyone has to quit their job and build a business.
01:43:09.000 But within your job, everyone has an opportunity to get promoted.
01:43:13.000 Everyone has an opportunity to remove the tedious job.
01:43:16.000 There was a Stanford study just recently, asking people, what percentage of your job is automatable?
01:43:22.000 And people said about half.
01:43:25.000 50% of what I do is routine and tedious.
01:43:27.000 And I don't want to do it.
01:43:28.000 And rather, and I have ideas on how to make the business better, how to make my job better.
01:43:33.000 And I think we can use AI to do it.
01:43:37.000 There's hunger in the workforce to use AI for people to reclaim their seat as the creative driver.
01:43:50.000 Because the thing that happened with the emergence of computers is that in many ways people became a little more drone-like and NPC-like.
01:43:58.000 They're doing the same thing every day.
01:44:00.000 But I think the real promise of AI and technology has always been automation so that we have more time either for leisure or for creativity or for ways in which we can advance our lives, change our lives or our careers.
01:44:17.000 And yeah, this is what gets me excited.
01:44:19.000 And I think it's, I don't think it's predominantly a rose-colored glasses thing because I'm seeing it every day.
01:44:29.000 And that's what gets me fired up.
01:44:31.000 It's also you have a biased sample group, right?
01:44:34.000 Because you have a bunch of people that are using your platform and they are achieving positive results.
01:44:39.000 But they're from every walk of life.
01:44:41.000 Look, we have a bunch of things that are happening simultaneously.
01:44:41.000 Yes.
01:44:44.000 And I think one of the big fears about automation and AI in general is the abruptness of the change.
01:44:49.000 Because it's going to happen, boom, jobs are going to be gone.
01:44:53.000 And then, well, these tedious jobs, do we really want people to be reduced to these tedious existences of just filing paperwork and putting things on shelves?
01:45:07.000 And they will tell you they don't want to be doing it.
01:45:09.000 They don't want to be doing that.
01:45:10.000 But then there's the thing of how do we educate people, especially people that are already set in their ways and they're mature adults.
01:45:19.000 How do you reach and inspire these people to say, okay, look, your job is gone and now you have this opportunity to do something different?
01:45:28.000 Go forth.
01:45:30.000 I think reskilling is something that has been done in the past with some amount of success.
01:45:39.000 Obviously, if you've never been exposed to technology... do you remember? I think it was a very cruel thing to say to the miners to go learn to code.
01:45:48.000 Yeah, learn to code.
01:45:49.000 Yeah, I think that's really cruel.
01:45:51.000 But if you're someone whose job is sort of a desk job, you already are on the computer, there's a lot of opportunity for you to reskill and start using AI to automate a big part of your job.
01:46:03.000 And yes, there's going to be job loss, but I think a lot of those people will be able to reskill.
01:46:07.000 And what we're doing with the government of Saudi Arabia, I would love to do in the U.S. So how is the government of Saudi Arabia using it?
01:46:15.000 So we're just starting right now.
01:46:17.000 What's their goal?
01:46:18.000 Their goal is twofold, or rather threefold.
01:46:22.000 One is an entire generation of people growing up with these creative tools.
01:46:31.000 Instead of just textbook learning, instead learning by doing, making things.
01:46:39.000 So an entire generation understanding how to make things with AI, how to code, and all of that stuff.
01:46:44.000 Second is upgrading sort of government operations.
01:46:48.000 So you could think of it sort of like DOGE, but more technological.
01:46:52.000 Can we automate big parts of what we do in HR, finance, and things like that?
01:46:57.000 And I think it's possible to build these specific AI agents that do part of a finance job or accounting job.
01:47:03.000 Again, all these routine things that people are doing, you can go and automate that and make government as a whole more efficient.
01:47:10.000 And third is entrepreneurship.
01:47:12.000 If you gave that power to more people to be able to kind of build businesses, then not only are they growing up with it, but also there's a culture of entrepreneurship.
01:47:25.000 And that already exists in Saudi Arabia.
01:47:28.000 I mean, the sad thing about the Middle East, there's so much potential, but there are so many wars and so much disaster.
01:47:35.000 Well, there's so much money.
01:47:36.000 There's also so much money.
01:47:37.000 Yeah, which is good.
01:47:38.000 And I think it's good for the United States.
01:47:41.000 Like, I think what President Trump did with the deals in the Gulf region is great.
01:47:47.000 It's going to be great for the United States.
01:47:48.000 It's going to be great for the Gulf region.
01:47:53.000 But I think we need more of that, you know, we talked about a government.
01:47:57.000 We need more of that enlightened view of education, of change in our government today.
01:48:06.000 You know, this idea that we're going to bring back the old manufacturing jobs, I understand Americans got really screwed with what happened.
01:48:15.000 Like, you know, these jobs got sent away by globalism, whatever you want to call it.
01:48:21.000 And a small number of people got massively rich.
01:48:23.000 A lot of people got disenfranchised.
01:48:27.000 And we had the opiate epidemic.
01:48:30.000 And it just did massive damage.
01:48:32.000 It did massive damage to the culture.
01:48:35.000 But is the way forward to bring back those jobs?
01:48:39.000 Or is there a new way of the future?
01:48:42.000 And there's probably a new manufacturing wave that's going to happen with robotics.
01:48:48.000 You know, the humanoid robots are starting to work.
01:48:54.000 And these, I think, will need a new way of manufacturing.
01:48:59.000 And so the U.S. can be at the forefront of that, can own that, bring new jobs into existence.
01:49:05.000 And all of these things need software.
01:49:07.000 Like our world is going to be primarily run by AI and robots and all of that.
01:49:11.000 And more and more people need to be able to make software.
01:49:14.000 Even if it is prompting and not really coding, you know, a lot more people just need to be able to make it.
01:49:19.000 There's going to be a need for more products and services and all of that stuff.
01:49:22.000 And I think there's enough jobs to go around if we have this mindset of let's actually think about the future of the economy as opposed to let's bring back certain manufacturing jobs, which I don't think Americans would want to do anyways.
01:49:38.000 Right.
01:49:39.000 They don't want to do the jobs.
01:49:40.000 My problem is there's some people that are doing those jobs right now and it's their entire identity.
01:49:46.000 You know, they have a good job, they work for a good company, they make a good living, and that might go away, and they're just not psychologically equipped to completely change their life.
01:49:57.000 What do you think is the solution there?
01:49:59.000 Which I agree, it's a real problem.
01:50:01.000 Well, desperation, unfortunately, is going to motivate people to make changes.
01:50:10.000 And it's going to also motivate some people to choose drugs.
01:50:16.000 My fear is that you're going to get a lot more people doing that.
01:50:16.000 That's my fear.
01:50:18.000 There are going to be a lot of people that figure it out and survive.
01:50:24.000 I mean, this is natural selection, unfortunately.
01:50:27.000 Like, applied to a digital world.
01:50:30.000 There's going to be people that just aren't psychologically equipped to recalibrate their life.
01:50:35.000 And that's my real fear.
01:50:37.000 My real fear is that there's a bunch of really good people out there that are, you know, valuable parts of a certain business right now that their identity is attached to being employee of the month.
01:50:49.000 They're good people.
01:50:50.000 They show up every day.
01:50:51.000 Everybody loves them and trusts them.
01:50:53.000 They do good work and everybody rewards them for that.
01:50:55.000 And that's part of who they are as a person.
01:50:57.000 They're a hardworking person.
01:50:58.000 Of course.
01:50:59.000 And they feel that way.
01:51:00.000 There's like a lot of real good people out there that are blue-collar, hard-working people.
01:51:06.000 And they take pride in that.
01:51:09.000 And that job's going to go away.
01:51:11.000 Well, I actually think that more white-collar jobs are going away.
01:51:15.000 I think so too.
01:51:16.000 So then blue-collar, which was the worry, like, go back 10 years ago and we thought, okay, self-driving cars, you know, robots in manufacturing.
01:51:16.000 Yeah.
01:51:29.000 And that turned out to be a lot harder than automating desk jobs, actually, because we have a lot more data.
01:51:39.000 For one, we have a lot more data on people sitting in front of a computer and doing Excel and writing things on the internet.
01:51:49.000 And so we're able to train these what we call large language models.
01:51:53.000 And those are really good at using a computer like a human uses a computer.
01:51:59.000 And so I think the jobs to be worried about, especially in the next months to a year, a little more, is the routine computer jobs where it's formulaic.
01:52:11.000 You go, you have a task, like quality assurance jobs, right?
01:52:14.000 Software quality assurance.
01:52:18.000 You have to constantly test the same feature of some large software company, Microsoft or whatever.
01:52:24.000 You're sitting there and you're performing the same thing again and again and again every day.
01:52:33.000 And if there's a bug, you kind of report it back to the software engineers.
01:52:38.000 And that is, I think, really in the bullseye of what AI is going to be able to do over the next months.
01:52:47.000 And do it much more efficiently.
01:52:49.000 Much more efficiently, much faster.
01:52:51.000 Yeah.
01:52:52.000 Yeah, those people have to be really worried.
01:52:55.000 Drivers, you know, professional drivers, like people who drive trucks and things along those lines, that's going away.
01:53:02.000 That's definitely going away.
01:53:03.000 Yeah.
01:53:03.000 And that's an enormous part of our society.
01:53:06.000 It's millions of jobs.
01:53:08.000 Right.
01:53:09.000 You know, I was watching a video on this coal mining operation in China that's completely automated, and it was wild to watch.
01:53:17.000 Every step of the way is automated, including recharging the trucks.
01:53:22.000 Like the trucks, you know, they're all electric.
01:53:25.000 Everything's run on electricity.
01:53:26.000 They recharge themselves.
01:53:28.000 You know, they're pulling the coal out of the ground.
01:53:31.000 They're stacking it, inventory, everything.
01:53:34.000 Storage, it's all automated and it runs 24-7.
01:53:37.000 I'm like, this is wild.
01:53:38.000 This is crazy.
01:53:39.000 Yeah, I remember watching the video of BYD making an electric vehicle.
01:53:46.000 It was really satisfying to watch.
01:53:48.000 It's all like the entire assembly line is automated.
01:53:51.000 The way they put the paint and the way they do the entire thing is...
01:54:01.000 They're so advanced.
01:54:03.000 There's this guy that I follow on Instagram.
01:54:05.000 God, I can't remember his name.
01:54:07.000 I really wish I could right now, but he reviews a lot of electric vehicles, like very, like, I've never even heard of these companies.
01:54:15.000 And they're incredible.
01:54:17.000 They're so advanced.
01:54:19.000 And their suspension systems are so superior to the suspension systems of even like German luxury cars.
01:54:25.000 Like they did a demonstration where they drove one of these Chinese electric vehicles over an obstacle course.
01:54:33.000 And then they had like a BMW and a Mercedes go over.
01:54:35.000 And the BMW is all, work, work, work.
01:54:38.000 And the Chinese one is fucking flat planing the entire way.
01:54:42.000 Every bump in the road is being completely absorbed by the suspension.
01:54:46.000 Right.
01:54:46.000 It's all AI.
01:54:48.000 So much better than what we have.
01:54:50.000 Right.
01:54:50.000 Like, so much.
01:54:52.000 What is this?
01:54:53.000 That's him.
01:54:53.000 Yep.
01:54:54.000 That's him.
01:54:54.000 Forrest Jones.
01:54:55.000 Shout out to Forrest.
01:54:56.000 He's great.
01:54:57.000 He does like these really fast-paced videos, but he does a lot of cars that are available here in America as well.
01:55:04.000 But he does a shit ton of them that aren't.
01:55:06.000 Which one is this one here?
01:55:08.000 NIO.
01:55:09.000 Yeah, listen to him because he's pretty good at this shit.
01:55:11.000 710 horsepower.
01:55:12.000 I get cameras here, LIDAR there for self-driving, and this has two NIO-made chips.
01:55:17.000 And for reference, one of those chips is as powerful as four NVIDIA chips.
01:55:21.000 And this has two.
01:55:22.000 NIO also has battery swap stations, so if you're in a rush, you can hit one up.
01:55:25.000 It'll lift your car, swap out your battery, put in a fully charged one in between three and five minutes.
01:55:29.000 But here's where the S-Class should be worried.
01:55:30.000 Not only does this have rear steer and steer-by-wire, so it's extremely easy to maneuver, it may have one of the most advanced hydraulic systems I've ever seen.
01:55:36.000 It can pretty much counteract any bump.
01:55:38.000 After you go over something four times, it'll memorize it so that the fifth time, it's like that bump never existed.
01:55:42.000 Inside, you get pillows in your headrest, heated, ventilated, and massaging leather seats, a passenger screen built into my dash, a main screen that works super fast.
01:55:49.000 I get a driving display, a head-up display, and my steering works super fast.
01:55:55.000 Pretty dope.
01:55:56.000 What's interesting about the car is learning the terrain.
01:56:00.000 If it went over it once, it'll learn it.
01:56:03.000 And I think this is the next sort of big thing with AI, whether it's robotics, cars, or even ChatGPT now, it has memory.
01:56:13.000 It learns about you, sort of similar to how social media feeds learn about you, though I think in a lot of ways that's more negative.
01:56:24.000 I think these systems will start to have more online learning.
01:56:29.000 Instead of just training them in these large data centers on these large datasets and then giving you this thing that doesn't know anything about you, it's totally stateless.
01:56:40.000 As you use these devices, they will learn your pattern, your behavior, and all that.
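To make the online-learning idea concrete, here is a minimal sketch of the memorize-the-bump behavior described above: the system records a road disturbance at a location on each pass and, after enough passes, switches to pre-compensating. The four-pass threshold mirrors the reviewer's description; the class and method names are illustrative assumptions, not any carmaker's actual system.

```python
# Toy sketch of "memorize the bump": record disturbances per location
# online, and pre-compensate once a location has been seen enough times.
from collections import defaultdict

class BumpMemory:
    def __init__(self, passes_to_learn: int = 4):
        self.passes_to_learn = passes_to_learn
        self.seen = defaultdict(int)  # location -> times a bump was felt

    def observe(self, location: tuple, bump_detected: bool) -> str:
        if self.seen[location] >= self.passes_to_learn:
            return "pre-compensate"   # learned: soften before the bump
        if bump_detected:
            self.seen[location] += 1  # online update from this pass
        return "react"                # fall back to reactive damping

memory = BumpMemory()
for trip in range(5):
    action = memory.observe((30.2672, -97.7431), bump_detected=True)
    print(f"pass {trip + 1}: {action}")  # passes 1-4 react, pass 5 pre-compensates
```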
01:56:46.000 Yeah.
01:56:47.000 Why is China so much better at making these cars than us?
01:56:52.000 Because they're really advanced.
01:56:54.000 Yeah.
01:56:57.000 I'm not an expert in China, but a lot of people think that the thing that makes China better at manufacturing is the sort of, quote unquote, treating workers like slaves.
01:57:19.000 So, slave work or whatever, which I'm sure some of that happens.
01:57:25.000 But Tim Cook recently said, maybe not so recent, but he thinks, you know, part of the reason why they manufacture in China is there's expertise there that developed over time.
01:57:36.000 Yeah, that's why they want to use the Chinese manufacturing for the iPhone 17.
01:57:41.000 Yeah, and I think one of the things that are good about more technocratic systems, Singapore, obviously China's the biggest one, is that the leadership, and it comes at a cost of freedom and other things, but the leadership can have a 50-year view of where things are headed.
01:58:08.000 And they can say, while yes, we're now making the plastic crap, we don't want to keep making plastic crap.
01:58:17.000 We're going to build the capabilities and the automation and manufacturing expertise to be able to leapfrog the West in making these certain things.
01:58:29.000 Whereas it's been historically hard, again, for good reasons.
01:58:34.000 I think it's more freedom-preserving when you don't have that much power in government.
01:58:42.000 But I feel like America, we're the worst of both worlds, where increasingly the government is making more and more decisions and choices than ever.
01:58:53.000 But at the same time, we don't have this enlightened, like, you know, 10-year roadmap for where we want to be.
01:59:02.000 Yeah, because we never think that way because we deal in terms.
01:59:06.000 Yeah, four-year terms.
01:59:07.000 Four-year terms.
01:59:08.000 That's the problem.
01:59:09.000 Also, public companies.
01:59:10.000 Four-year terms, public companies, quarters.
01:59:13.000 Quarters.
01:59:13.000 Right.
01:59:14.000 And again, this is back to this managerial idea, companies run by managers. You know, part of the reason why Zuck has complete control...
01:59:23.000 He can...
01:59:27.000 Like, I don't know, $30, $40 billion, maybe more per year, maybe?
01:59:32.000 He spent a ton of money, like a small state's GDP worth of money, on VR.
01:59:40.000 And the public market was totally doubtful of that.
01:59:43.000 And the reason he could do that is because he has, what are they called, super voting shares.
01:59:50.000 And so he has complete control of the company.
01:59:52.000 And he can't be unseated by activist investors, sort of what's been done to – Are they trying to remove him from that?
02:00:05.000 They can't unless...
02:00:08.000 I think there's a trial that's going on.
02:00:10.000 It was going on very recently.
02:00:12.000 Oh, I think you're thinking about the antitrust.
02:00:15.000 No, no, there's something about him saying that he can't be fired.
02:00:20.000 But it's true.
02:00:21.000 It is true.
02:00:21.000 It's legal.
02:00:22.000 It is nonsense.
02:00:22.000 I know.
02:00:24.000 I believe the trial is nonsense.
02:00:25.000 But a friend of mine was actually representing him in this.
02:00:29.000 Maybe in Europe or something?
02:00:31.000 I don't think so.
02:00:32.000 I think it's in America.
02:00:34.000 Google Mark Zuckerberg Josh Dubin trial.
02:00:38.000 See if you can find anything on that.
02:00:40.000 But yeah, Mark can think on the order of decades.
02:00:46.000 Like when I was there at Facebook, he was talking about the idea that there's going to be a fundamental shift.
02:00:54.000 He's like, if you look back 100 years, computers every 20 years or whatever change the user interface modality.
02:01:05.000 You go from terminals and mainframes to desktop computers to mobile computing.
02:01:11.000 And he was like, okay, what's next?
02:01:13.000 And first guess was like VR.
02:01:15.000 And now I think their best guess is like AR plus AI.
02:01:20.000 The AR glasses, their new Meta Ray-Ban glasses plus AI.
02:01:25.000 And they can make massive investment.
02:01:27.000 They just made a crazy investment.
02:01:30.000 This company, Scale AI.
02:01:32.000 Scale AI is a data provider for OpenAI and Google.
02:01:37.000 And what they do is OpenAI will say, I want the best law and legal data to train the best legal machine learning model.
02:01:48.000 And they'll go to places where the labor costs are low, but maybe still well educated.
02:01:56.000 There are places in Africa and Asia that are like that.
02:01:59.000 And they'll sit them down and say, okay, you're going to get these tasks, these legal programming, whatever tasks, and you're going to do them and you're going to write your thoughts as you're doing them.
02:02:08.000 I'm simplifying it, but basically that they collect all this data.
02:02:11.000 Basically, it's labeled labor.
02:02:14.000 They take it, they put it in the models, and they train the models.
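As a rough illustration of the pipeline being described, here is a minimal sketch of what one collected record might look like: the task given to a domain expert, the reasoning they write down as they work, and the final answer, folded into a supervised training example. The schema and field names are assumptions for illustration, not Scale AI's actual format.

```python
# Hypothetical shape of one data-labeling record and how it might be
# turned into a training example. Schema is illustrative only.
from dataclasses import dataclass

@dataclass
class LabeledTask:
    domain: str      # e.g., "legal"
    task: str        # the prompt the worker was given
    reasoning: str   # the worker's thoughts, written down as they solve it
    answer: str      # the final work product

def to_training_example(record: LabeledTask) -> dict:
    # Fold the reasoning trace into the target so the model learns to
    # "think out loud" the way the worker did.
    return {
        "prompt": f"[{record.domain}] {record.task}",
        "completion": f"{record.reasoning}\n\nAnswer: {record.answer}",
    }

example = LabeledTask(
    domain="legal",
    task="Summarize the holding of this contract dispute.",
    reasoning="The clause at issue is the indemnification provision...",
    answer="The court held the indemnity clause unenforceable.",
)
print(to_training_example(example))
```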
02:02:18.000 And OpenAI spends billions of dollars on that, Anthropic, all these companies.
02:02:22.000 And so this company was the major data provider.
02:02:27.000 And Meta just acquired them.
02:02:31.000 There's this new trend of acquisitions, I assume because they want to get around regulations.
02:02:39.000 But they bought 49% of the company, and then they hired all the leadership.
02:02:48.000 So the Scale AI, like Meta, hired the leadership there and bought out the investors.
02:02:54.000 They put $15 billion into the company.
02:02:57.000 The weird thing about it is Google and OpenAI are like, we're not going to use this shit anymore.
02:03:03.000 So the company value went down because people, you know, these companies don't want to use it.
02:03:08.000 And now they're going to other companies.
02:03:10.000 And so in effect, Zuck bought talent for $15 billion.
02:03:17.000 Wow.
02:03:18.000 Can you imagine that? Talent for $15 billion.
02:03:24.000 Google recently bought a company for one known researcher, Noam Shazeer, one of the inventors of the large language model technology, for $3 billion.
02:03:36.000 And I think they're not really acquisitions.
02:03:40.000 They do these weird deals where they buy out the investors and they let the company run as a shell of itself and then they acquire the talent.
02:03:47.000 Wow.
02:03:48.000 Microsoft did the same thing.
02:03:51.000 That's crazy.
02:03:52.000 So it's just these unique individuals that are very valuable there.
02:03:56.000 Very, very valuable worth billions of dollars.
02:03:58.000 Sam Altman says Meta tried and failed to poach OpenAI's talent with $100 million offers.
02:04:03.000 So this $100 million is sign-on bonus.
02:04:10.000 This is not even salary.
02:04:13.000 Or yeah, equity.
02:04:14.000 It's just bonus.
02:04:15.000 It's just a bonus.
02:04:16.000 $100 million bonus.
02:04:17.000 Come here.
02:04:18.000 Failed.
02:04:19.000 And failed.
02:04:20.000 I don't know about failed.
02:04:21.000 I mean, I'm sure he's going to say that.
02:04:22.000 He worded it in a weird way.
02:04:23.000 He said, our best talent hasn't taken it.
02:04:25.000 So you could have gotten that.
02:04:25.000 Of course he's going to say that.
02:04:26.000 Of course.
02:04:27.000 The people that did take it.
02:04:28.000 Well, they weren't the best of us.
02:04:31.000 We don't even like those guys.
02:04:32.000 And by the way, OpenAI does it to companies like ours.
02:04:35.000 It's just a question of scale.
02:04:37.000 Like, Zuck can give them $100 million and steal the best talent.
02:04:41.000 And companies like OpenAI, which I love, but they go to small startups and give them $10 million to grab their talent.
02:04:51.000 But it's very, very competitive right now.
02:04:55.000 And there are, like, I don't know if these individuals are actually worth these billions of dollars, but the talent war is so crazy because everyone feels like there's a race towards getting to super intelligence.
02:05:10.000 And the first company to get to super intelligence is going to reap massive amounts of rewards.
02:05:15.000 How far away do you think we are from achieving that?
02:05:18.000 Well, you know, like I said, my philosophy tends to be different than I think the mainstream in Silicon Valley.
02:05:24.000 I think that AI is going to be extremely good at doing labor, extremely good at ChatGPT being a personal assistant, extremely good at Replit being an automated programmer.
02:05:44.000 But the definition of superintelligence is that it is better than all humans collectively at any task.
02:05:57.000 And I am not sure there's evidence that we're headed there.
02:06:02.000 Again, I think that one important aspect of superintelligence or AGI is that you drop this entity into an environment where it has no idea about that environment.
02:06:16.000 It's never seen it before.
02:06:17.000 And it's able to efficiently learn to achieve goals within that environment.
02:06:22.000 Right now, there are a bunch of studies showing, like, you know, GPT-4 or any of the latest models, if you give them an exam or a quiz that is even slightly different from their training data, they tank.
02:06:38.000 They do really badly on it.
02:06:41.000 I think the way that AI will continue to get better is via data.
02:06:46.000 Now, at some point, and maybe this is the point of takeoff, is that they can train themselves.
02:06:54.000 And the way we know how AI could train itself is through a method called self-play.
02:07:02.000 So the way self-play works is, you know, take for example, AlphaGo.
02:07:07.000 AlphaGo is, I'm sure you remember Lee Sedol, a match between DeepMind's AlphaGo and Lee Sedol, and it won in the game of Go.
02:07:17.000 The way AlphaGo is trained is that part of it is a neural network that's trained on existing data.
02:07:24.000 But the way it achieves superhuman performance in that one domain is by playing itself like millions, billions, perhaps trillions of times.
02:07:38.000 So it starts by generating random moves and then it learns what's the best moves.
02:07:43.000 And it's basically a multi-agent system where it learns, oh, I did this move wrong, and I need to kind of re-examine it.
02:07:49.000 And it trains itself really, really quickly by doing the self-play.
02:07:53.000 It'll play fast, fast games with itself.
02:07:58.000 But we know how to make this in game environments, because game environments are closed environments.
02:08:05.000 But we don't know how to do self-play, for example, on literature, because you need objective truth.
02:08:17.000 In literature, there's no objective truth.
02:08:19.000 Taste is different.
02:08:22.000 Conjecture, philosophy, there's a lot of things.
02:08:26.000 And again, I go back to why there's still a premise of humans, is there are a lot of things that are intangible.
02:08:36.000 And we don't know how to generate objective truth in order to train machines in the self-play fashion.
02:08:43.000 But programming has objective truth.
02:08:48.000 Coding has objective truth.
02:08:50.000 The machine can have, like, you can construct an environment that has a computer and has a problem.
02:08:56.000 There's a ton of problems.
02:08:57.000 And even an AI can generate sample problems.
02:09:01.000 And then there's a test to validate whether the program works or not.
02:09:06.000 And then you can generate all these programs, test them, and if they succeed, that's a reward that trains your system to get better at that.
02:09:17.000 If it doesn't succeed, that's also feedback.
02:09:20.000 And they run them all the time, and it gets better at programming.
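A minimal sketch of that loop, under obvious simplifications: candidate "programs" are generated, each is checked against a test that encodes the objective truth, and pass/fail becomes the signal that shifts future generations. A real system would generate code with an LLM and apply reinforcement-learning updates; the scalar "policy" here is just a stand-in for model weights.

```python
# Toy sketch of self-play-style training with verifiable rewards:
# generate a candidate program, run it against a test (the "objective
# truth"), and use pass/fail as the training signal.
import random

def make_problem():
    # Sample problem: given (a, b), the correct program returns a + b.
    a, b = random.randint(2, 9), random.randint(2, 9)
    return (a, b), a + b

def run_candidate(policy, a, b):
    # The "policy" biases generation toward the correct program (addition)
    # versus a plausible wrong one (multiplication).
    return a + b if random.random() < policy else a * b

def train(steps=2000):
    policy = 0.5  # start with random generation
    for _ in range(steps):
        (a, b), expected = make_problem()
        passed = run_candidate(policy, a, b) == expected  # run the test
        # Success is rewarded; failure is also feedback, just negative.
        policy = min(1.0, policy + 0.01) if passed else max(0.0, policy - 0.005)
    return policy

print(f"probability of generating the passing program: {train():.2f}")
```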
02:09:23.000 So I'm confident programming is going to get a lot better.
02:09:26.000 I'm confident that math is going to get a lot better.
02:09:31.000 But from there, it is hard to imagine how all these other more subjective, softer sorts of sciences, how the AI will get better at those through self-play.
02:09:47.000 I think the AI will only be able to get better through data from human labor.
02:09:55.000 If AI analyzes all the past creativity, all the different works of literature, all the different music, all the different things that humans have created completely without AI.
02:10:09.000 Do you think it could understand the mechanisms involved in creativity and make a reasonable facsimile?
02:10:19.000 I think it will be able to imitate very well how humans come up with new ideas in a way that it remixes all the existing ideas from its training data.
02:10:36.000 But by the way, again, this is super powerful.
02:10:38.000 This is not like a dig at AI.
02:10:40.000 The ability to remix all the available data into new, potentially new ideas or newish ideas because they're remixes, they're derivative, is still very, very powerful.
02:10:52.000 But, you know, the best marketers, the best, like... think of one of my favorite marketing videos, Think Different from Apple.
02:11:02.000 It's awesome.
02:11:03.000 Like, I don't think machines are really at that point. I try to talk to ChatGPT a lot about, like, you know, marketing or naming.
02:11:12.000 It's so bad at that.
02:11:13.000 It's like Midwit bad at that.
02:11:16.000 And I, you know, but that's the thing.
02:11:21.000 It's like, I just don't see, and look, I'm not an AI researcher and maybe they're working, they have ideas there.
02:11:28.000 But in the current landscape of the technology that we have today, it's hard to imagine how these AIs are going to get better at, say, literature or the softer things that we as humans find really compelling.
02:11:43.000 What's interesting is the thing that's the most at threat is these sort of middle-of-the-road Hollywood movies that are essentially doing exactly what you said about AI.
02:11:54.000 They're sort of like, you know, they're sort of remixing old themes and tropes and figuring out a way to repackage it.
02:12:06.000 But I think actually those tools in the hands of humans, they'll be able to create new interesting movies and things like that.
02:12:12.000 In the hands of humans.
02:12:12.000 Right.
02:12:14.000 So with additional human creativity applied.
02:12:17.000 So the man-machine symbiosis.
02:12:17.000 Right.
02:12:21.000 Right.
02:12:22.000 This was the term that was used by J.C.R. Licklider, like the grandfather of the internet, from ARPA.
02:12:29.000 A lot of those guys kind of imagined a lot of what's going to happen, a lot of the future, and this idea of like human plus machine will be able to create amazing things.
02:12:39.000 So what people are making with Veo is not just because the machine is really good at, like, generating it and making it.
02:12:47.000 But it can't make it without the prompts.
02:12:49.000 Like the really funny ones, yeah, without the prompts, like the Bigfoot finds tren and they inject themselves with tren, they start working out.
02:13:06.000 I'm telling you, my TikTok feed is really wild right now.
02:13:12.000 It takes this real weird, distorted human mind to come up with this.
02:13:21.000 Have you seen the ones where it's Trump and Elon and Putin and they're all in a band?
02:13:27.000 They're playing Creedence Clearwater Revival.
02:13:30.000 Fortunate Son.
02:13:31.000 It's crazy.
02:13:32.000 Another one is the LA riots and how all the world leaders are sort of gangsters in the riots.
02:13:41.000 That one is hilarious.
02:13:43.000 Yeah, that kind of stuff is fun.
02:13:44.000 And it's interesting how quickly it can be made, too.
02:13:48.000 Something that would take a long time through these video editors where they were using computer generated imagery for a long time, but it was very painstaking and very, you know, very expensive.
02:14:01.000 Now it's really cheap.
02:14:02.000 On the way here, I was like, I want to make an app to sort of impress you with our technology.
02:14:07.000 I was like, what would Joe like?
02:14:09.000 And then I came up with this idea of like a squat form analyzer.
02:14:15.000 And so in the car on the way here, sorry, in the lobby, I made this app...
02:14:25.000 On the way, on my phone.
02:14:30.000 And this is the really exciting thing about what we built with being able to program on your phone is being able to have that inspiration that can come anytime and just immediately pull out your phone and start building it.
02:14:48.000 So here, I'll show you.
02:14:56.000 So basically you just start recording and then do a few squats.
02:15:09.000 Okay.
02:15:12.000 It's going to analyze it just from there?
02:15:16.000 I mean the camera angle is not that great, but it's going to be able to tell you whether or not you're doing it well?
02:15:24.000 Yeah.
02:15:29.000 Those are not my best squats.
02:15:32.000 Just so you know, Joe Rogan.
02:15:33.000 I'm not judging you.
02:15:36.000 I used to squat, you know, 350 pounds.
02:15:42.000 So now it's integrating the Google Gemini model to kind of run through the video, analyze it, and it'll come up with a score and then suggestions.
02:15:54.000 And so again, this is like a random idea.
02:15:57.000 I was like, okay, what would be interesting to do?
02:16:00.000 This is a really interesting thing that people could use at the gym, though.
02:16:03.000 Like, not just for squats, but maybe for chin-ups and all kinds of stuff.
02:16:07.000 Like, oh, maybe, you know, I'm looking at your form, and this is what you need to do.
02:16:11.000 Get a little lower, you know, make your elbows parallel to your body, whatever.
02:16:16.000 I built so many personal apps.
02:16:18.000 Like, I built apps for analyzing my health.
02:16:21.000 Like, I talked about some of my health problems that are now a lot better.
02:16:24.000 Look, bad form.
02:16:28.000 Just like straight away, critical.
02:16:31.000 Yeah, knee position, unable to properly assess from the video angle.
02:16:36.000 So, yeah, it's a little okay.
02:16:37.000 So, it's saying it's not the best angle.
02:16:39.000 But it's saying my depth is bad, which was actually bad.
02:16:44.000 So, and I was leaning forward.
02:16:46.000 But it's pretty good.
02:16:47.000 You know, I tried it a few times.
02:16:49.000 It's really good at that.
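For flavor, here is a minimal sketch of the kind of call an app like this might make, assuming the google-generativeai Python SDK's file-upload pattern for video understanding. The prompt wording, model name, and output format are illustrative guesses, not the actual Replit-built app.

```python
# Minimal sketch of a squat-form analyzer: upload a workout video to a
# multimodal model and ask for a score plus suggestions. Assumes the
# google-generativeai SDK; prompt and model name are illustrative.
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder, not a real key

def analyze_squat(video_path: str) -> str:
    video = genai.upload_file(path=video_path)
    while video.state.name == "PROCESSING":  # wait for the upload to be ready
        time.sleep(2)
        video = genai.get_file(video.name)
    model = genai.GenerativeModel("gemini-1.5-pro")
    prompt = (
        "You are a strength coach. Review this squat video and report: "
        "a form score out of 10, depth, forward lean, and knee position, "
        "plus concrete suggestions. Note anything the camera angle makes "
        "it impossible to assess properly."
    )
    response = model.generate_content([video, prompt])
    return response.text

print(analyze_squat("squats.mp4"))
```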
02:16:50.000 And so I build a lot of apps for just my personal life.
02:16:56.000 That would be great for a person who doesn't want a trainer.
02:16:59.000 Right.
02:17:00.000 You know, I don't want to deal with some person.
02:17:02.000 Let me just work out on my own.
02:17:03.000 But am I doing this right?
02:17:05.000 Set your phone up.
02:17:06.000 Have it correct you.
02:17:07.000 Yeah.
02:17:08.000 Yeah.
02:17:09.000 At the office, some guys are building, we have this partnership with Whoop.
02:17:13.000 I don't know if you've seen it, they're building an app so we can start competing on workouts based on Whoop data.
02:17:21.000 Oh, that's awesome.
02:17:24.000 Our company is very weird for Silicon Valley.
02:17:24.000 Yeah.
02:17:27.000 We have a jujitsu mat and we have...
02:17:30.000 Oh really?
02:17:33.000 Oh, that's fucking great.
02:17:34.000 That's awesome.
02:17:35.000 Talking hurt.
02:17:37.000 It's, you know, I only recently got into it, but the hardest thing about it is to be calm because your impulse is to overpower.
02:17:50.000 Yeah.
02:17:50.000 Yes.
02:17:52.000 The Gracies have a great saying, keep it playful.
02:17:55.000 Yeah.
02:17:56.000 And that's how you really learn the best.
02:17:58.000 It's very hard.
02:17:58.000 And listen, I'm a giant hypocrite because most of my jiu-jitsu career, I was a meathead.
02:18:05.000 That's one of the reasons why I started really lifting weights a lot.
02:18:08.000 It's like I realized strength is very valuable.
02:18:11.000 And it is.
02:18:12.000 And it is valuable.
02:18:13.000 But technique is the most valuable.
02:18:15.000 And the best way to acquire technique is to pretend that you don't have any strength.
02:18:19.000 The best way to acquire technique is to pretend to.
02:18:22.000 Yeah, don't force things.
02:18:24.000 Just find the best path.
02:18:26.000 And that requires a lot of data.
02:18:29.000 So you have to understand the positions.
02:18:32.000 So you have to really analyze them.
02:18:33.000 The best jiu-jitsu guys are really smart.
02:18:36.000 Like Mikey Musumeci, Gordon Ryan, Craig Jones.
02:18:40.000 Those are very intelligent people.
02:18:42.000 And that's why they're so good at jiu-jitsu.
02:18:44.000 And then you also have to apply that intelligence to recognize that discipline is a massive factor.
02:18:51.000 Like Mikey Musumeci trains every day, 12 hours a day.
02:18:55.000 12 hours a day?
02:18:56.000 12 hours a day.
02:18:57.000 Oh, yeah.
02:18:58.000 Is that humanly possible to do?
02:18:59.000 It's possible.
02:19:00.000 Yeah, because he's not training full blast.
02:19:02.000 It's not like, like, you can't squat 12 hours a day, 350 pounds.
02:19:06.000 Your body will break down.
02:19:08.000 But you can go over positions over and over and over and over again until they're in muscle memory, but you're not doing them at full strength, right?
02:19:15.000 So like if you're rolling, right?
02:19:18.000 So say if you're doing drills, you would set up like a guard pass.
02:19:23.000 You know, when you're doing a guard pass, you would tell the person, lightly resist, and I'm going to put light pressure on you.
02:19:30.000 And you go over that position, you know, knee shield, pass, you know, hip into it, here's the counter, on the counter, d'arce, you know, go for the d'arce.
02:19:43.000 The person defends the d'arce, roll, take the back.
02:19:46.000 And just do that over and over and over again.
02:19:48.000 Until it's muscle memory.
02:19:49.000 Right.
02:19:50.000 And it's like completely ingrained in your body.
02:19:53.000 It's like chess players: let's focus on the endgame.
02:19:56.000 Just keep repeating the endgame.
02:19:56.000 Yeah.
02:19:58.000 I read the Josh Waitzkin book.
02:20:02.000 What was it called?
02:20:03.000 I forgot.
02:20:04.000 You know, his book about, like, I think, chess and jiu-jitsu, was it?
02:20:08.000 Yeah, Josh was just in here a few months ago.
02:20:10.000 He's great.
02:20:10.000 Yeah.
02:20:11.000 But it's so interesting to see a super intelligent person apply that intelligence to jiu-jitsu.
02:20:18.000 You know, one of the interesting things, when I started getting into it... I've always been into different kinds of sports, and then periods of extreme programming and obesity.
02:20:34.000 But then I tried to get back into it.
02:20:36.000 I was a swimmer early on.
02:20:39.000 But one thing that I found, especially in the lifting communities, is how intelligent everyone is.
02:20:45.000 They're actually almost like, you know, they're so focused, they're autistically focused on like form and program.
02:20:57.000 And, you know, they spend so much time designing these spreadsheets for your program.
02:21:03.000 Well, people have this view of things physical, that physical things are not intelligent things, but you need intelligence in order to manage emotions.
02:21:20.000 Emotions are a critical aspect of anything physical.
02:21:24.000 Any really good athlete, you need a few factors.
02:21:28.000 You need discipline, hard work, genetics, but you need intelligence.
02:21:32.000 It might not be the same intelligence applied.
02:21:35.000 People also, they confuse intelligence with your ability to express yourself, your vocabulary, your history of reading.
02:21:49.000 That's like a bias almost.
02:21:50.000 Yes.
02:21:51.000 That's a language bias.
02:21:52.000 That's like the sort of modern desk job, the laptop class bias.
02:21:57.000 Well, they assume that anything that you're doing physically, you're now no longer using your mind.
02:22:02.000 But it's not true.
02:22:04.000 In order to be disciplined, you have to understand how to manage your mind.
02:22:07.000 Managing your mind is an intelligence.
02:22:10.000 And the ability to override those emotions, to conquer that inner bitch that comes to you every time I lift that fucking lid off of that cold plunge, that takes intelligence.
02:22:21.000 You have to understand that this temporary discomfort is worth it in the long run because I'm going to have an incredible result after this is over.
02:22:31.000 I'm going to feel so much better.
02:22:32.000 Right, right, right.
02:22:33.000 Yeah, I haven't thought about intelligence in order to manage your emotions, but that's totally true because you're constantly doing the self-talk.
02:22:40.000 You're trying to trick yourself into doing that.
02:22:42.000 There are people that are very intelligent that don't have control over their emotions.
02:22:46.000 But they're intelligent in some ways.
02:22:48.000 It's just they've missed this one aspect of intelligence, which is the management of the functions of the mind itself.
02:22:56.000 And they don't think that that's critical.
02:22:58.000 But it is critical.
02:22:58.000 It's critical to every aspect of your life.
02:23:01.000 And it'll actually improve all those other intellectual pursuits.
02:23:04.000 You know, to tie it back to the AI discussion, I think a lot of the sort of programmer researcher type is like they know that one form of intelligence and they over-rotate on that.
02:23:15.000 And that's why it was like, oh, we're so close to perfecting intelligence.
02:23:21.000 Because that's what you know.
02:23:22.000 But there's a lot of other forms of intelligence.
02:23:24.000 There's a lot of forms of intelligence.
02:23:26.000 And unfortunately, we're very narrow in our perceptions of these things and very biased.
02:23:34.000 And we think that our intelligence is the only intelligence.
02:23:37.000 And that this one thing that we concentrate on, this is the only thing that's important.
02:23:41.000 Right.
02:23:44.000 Have you read or done any CBT cognitive behavior therapy?
02:23:47.000 No.
02:23:49.000 Basically, CBT is like a way to get over depression and anxiety based on self-talk and cues.
02:24:00.000 I had to use it, again, I had like sleep issues.
02:24:02.000 I had to use CBT-I, cognitive behavioral therapy for insomnia.
02:24:08.000 And the idea behind it is to build up what's called sleep pressure.
02:24:17.000 So, first of all, insomnia is performance anxiety.
02:24:27.000 Once you have insomnia, you start having anxiety.
02:24:31.000 But by the time bedtime comes, you're like, oh my God, I'm just going to, you know, toss and turn in bed and I'm just going to be in bed.
02:24:38.000 And then you start associating your bedroom with the suffering of insomnia because you're sitting there and like, you know, all night and really suffering.
02:24:49.000 It's really horrific.
02:24:52.000 And first of all, you treat your bedroom as a sanctuary.
02:24:56.000 You're only there when you want to sleep.
02:24:58.000 So that's like one thing you program yourself to do.
02:25:03.000 And the other thing is you don't nap the entire day.
02:25:07.000 You don't nap at all, no matter what happens.
02:25:09.000 Like even if you're real sleepy, like get up and take a walk or whatever.
02:25:13.000 And then you build up what's called sleep pressure.
02:25:15.000 Like now you have like a lot of sleepiness.
02:25:19.000 So you go to bed, you try to fall asleep.
02:25:22.000 If you don't fall asleep within 15, 20 minutes, you get up, you go out, you do something else.
02:25:28.000 And then when you feel really tired again, you go back to bed.
02:25:31.000 Oh, God.
02:25:33.000 And then finally, once you fall asleep, if you wake up in the middle of the night, which is another sort of form of insomnia, instead of staying in bed, you get up, you go somewhere else, you go read or do whatever.
02:25:44.000 And slowly you program yourself to see your bed and, oh, like the bed is where I sleep.
02:25:50.000 It's only where I sleep.
02:25:51.000 I don't do anything else there.
02:25:53.000 And you can get over insomnia that way instead of using pills and all the other stuff.
02:25:58.000 Oh, the pills are the worst.
02:26:00.000 God, people that need those fucking things to sleep, I feel for them.
02:26:05.000 I can sleep like that.
02:26:06.000 That's amazing.
02:26:07.000 I can sleep on the bus.
02:26:08.000 That's a blessing.
02:26:08.000 That's a blessing.
02:26:09.000 That's a huge blessing.
02:26:10.000 My wife hates it.
02:26:11.000 It drives her nuts because sometimes she has insomnia.
02:26:13.000 I could sleep on rocks.
02:26:14.000 I could just go lay down on a dirt road and fall asleep.
02:26:18.000 Wow.
02:26:18.000 But I'm always going hard.
02:26:19.000 When you're always going hard, you're like, yeah, I don't take naps.
02:26:24.000 And I work out basically every day.
02:26:27.000 And so I'm always tired.
02:26:28.000 I'm always ready to go to sleep.
02:26:30.000 So do you fight it or do you just, it's not in you to take a nap?
02:26:34.000 I don't need a nap.
02:26:36.000 Yeah, I never need naps.
02:26:36.000 Yeah?
02:26:38.000 How many hours do you sleep?
02:26:39.000 I try to get eight.
02:26:42.000 No, last night I didn't get eight, but I got seven, six and a half.
02:26:42.000 Do you get eight?
02:26:47.000 Probably I got six and a half last night.
02:26:48.000 Yeah.
02:26:49.000 But that was because I got home and I started watching TV because I was a little freaked out about the war.
02:26:54.000 And so when I'm freaked out about the war, I like to fill my mind with nonsense.
02:26:59.000 Oh, okay.
02:27:00.000 Well, I just watch things that have nothing to do with the war.
02:27:03.000 Like, I play pool.
02:27:05.000 I'm pretty competitive.
02:27:06.000 I'm pretty good.
02:27:07.000 And so I like watching professional pool matches.
02:27:10.000 And there's a lot of them on YouTube.
02:27:11.000 So I just watch pool.
02:27:12.000 And I just watch, you know, patterns, how guys get out, stroke, how they use their stroke, like how different guys have different approaches to the game.
02:27:21.000 It's crazy, the type A people.
02:27:23.000 It's like for you, although pool is an escape, it suddenly becomes an obsession.
02:27:27.000 And you're like, I need to be the best at it.
02:27:29.000 I'm very obsessed.
02:27:30.000 So I totally quit video games, but then last year I was very stressed.
02:27:35.000 The company was doing really poorly before we sort of invented this agent technology.
02:27:40.000 And then also the Gaza genocide.
02:27:44.000 I was like watching these videos every night.
02:27:47.000 It was just really, really affecting me.
02:27:50.000 I can't watch that stuff at night.
02:27:53.000 At night is when I get my anxiety.
02:27:56.000 I mean, I don't generally have anxiety, not like a lot of people do.
02:28:01.000 I mean, when I say anxiety, I really feel for people that genuinely suffer from actual anxiety.
02:28:06.000 My anxiety is all sort of self-imposed.
02:28:08.000 And when I get online at night and I think about the world, my family's asleep, which is generally when I write in a, as long as I'm writing, I'm okay.
02:28:19.000 Comedy.
02:28:20.000 Yeah.
02:28:20.000 You know, I write like sort of an essay form, then I extract the comedy from it.
02:28:25.000 But when I get online and I just pay attention to the world, that's when I really freak out because it's all out of your control.
02:28:31.000 And it's just murderous psychopaths that are running the world.
02:28:35.000 And it just, at any moment, you could be, you know, in a place where they decide to attack.
02:28:44.000 And then you're a pawn in this fucking insane game that these people are playing in the world.
02:28:51.000 That's why I felt really frustrated with my family being there.
02:28:53.000 I was like, they have no say in it.
02:28:57.000 The war started, rockets are flying.
02:29:00.000 But anyways, I started playing a video game.
02:29:06.000 It's called Hades 2.
02:29:09.000 It's like an RPG video game.
02:29:13.000 And I was like, I'm trying to disconnect.
02:29:15.000 And then I started speedrunning that game.
02:29:17.000 Do you know what speedrunning is?
02:29:18.000 It's like you're trying to finish the game as fast as possible, as fast as humanly possible.
02:29:18.000 No.
02:29:23.000 And I got down to like six minutes, and I was number 50 in the world.
02:29:28.000 Whoa.
02:29:29.000 But legitimately.
02:29:30.000 Oh, yeah, yeah.
02:29:31.000 My score is online to play for you.
02:29:35.000 That was crazy.
02:29:37.000 Why is he doing that?
02:29:38.000 That was crazy.
02:29:40.000 It's myth building, you know.
02:29:41.000 Yeah, weird.
02:29:44.000 But yeah, it is this thing about type A people.
02:29:47.000 Like you're just – even your escapism becomes competitive and stressful.
02:29:54.000 Well, sort of, but it's also – I feel like pool is a discipline, just like archery.
02:30:01.000 I'm also obsessed with archery.
02:30:02.000 Archery is a discipline.
02:30:04.000 And I feel like the more divergent disciplines that you have in your life, the more you understand what it is about these things that makes you excel and get better at them.
02:30:14.000 And the more when I get better at those things, I get better at life.
02:30:20.000 I apply it to everything.
02:30:22.000 Yeah, this is another thing that AI now struggles with, which is called transfer learning.
02:30:27.000 Learning something in one domain, like learning how to do reasoning on math, and being able to do reasoning on politics.
02:30:33.000 We just don't have evidence of that yet.
02:30:37.000 And I feel the same way.
02:30:39.000 Everything, like even powerlifting, when I got really into it, which is like the most unhealthy sport you can do.
02:30:46.000 You break your joints down.
02:30:47.000 Break your joints.
02:30:48.000 You look like shit because the more you eat, you can lift more.
02:30:52.000 You get fat.
02:30:53.000 They're all fat.
02:30:54.000 They don't go bottom of them.
02:30:55.000 Unless they're competing at a weight class.
02:30:58.000 Yeah.
02:31:02.000 And what is that, Rippetoe?
02:31:02.000 Have you ever had him on?
02:31:04.000 The Jug of Milk?
02:31:06.000 Do you know Go Mad?
02:31:06.000 Go Mad.
02:31:07.000 No.
02:31:08.000 Gallon of Milk a Day.
02:31:09.000 Do you know that, Jamie?
02:31:11.000 Do you know it?
02:31:12.000 Disgusting.
02:31:12.000 Disgusting.
02:31:13.000 Yeah, so basically.
02:31:14.000 Gallon of Milk a Day.
02:31:15.000 Yeah, so Mark Rippetoe, he wrote this book called Starting Strength.
02:31:19.000 And it became like the main way most guys, at least my age, like getting into powerlifting.
02:31:25.000 It was about technique.
02:31:26.000 It was about his whole thing is like, look, everyone comes into lifting.
02:31:30.000 They think it's bodybuilding.
02:31:31.000 Powerlifting is nothing like that.
02:31:33.000 And he also looks like shit and he's fat.
02:31:35.000 But his technique is amazing.
02:31:39.000 And so the way he gets young guys to get really good and really strong, he puts them on a gallon of milk a day.
02:31:45.000 Does that really have a positive effect?
02:31:48.000 Yeah, I mean, he has a YouTube channel.
02:31:50.000 He has a lot of guys that are really, really strong.
02:31:52.000 And he's been a coach for a lot of people.
02:31:54.000 What is it about a gallon of milk a day?
02:31:55.000 Is it just the protein intake?
02:31:56.000 What is it?
02:31:57.000 A calories.
02:31:59.000 Okay, here it is.
02:31:59.000 Drink a gallon of milk a day, GOMAD, is undeniably the most effective nutritional strategy for adding slabs of mass to young underweight males.
02:32:07.000 Milk is relatively cheap, painless to prepare, and the macronutrient profile is very balanced, and calories are always easier to drink than eat.
02:32:15.000 Unfortunately, those interested rather in muscular hypertrophy, who are not young, underweight, and male, populations where GOMAD is not recommended, will need to put more effort into the battle to avoid excess fat accumulation.
02:32:31.000 Body composition can be manipulated progressively, much like barbell training to achieve the best results.
02:32:36.000 For example, the Starting Strength novice linear progression holds exercise selection, frequency, and volume variables constant.
02:32:44.000 Every 48 to 72 hours, the load stressor is incrementally increased to elicit an adaptation in strength.
02:32:50.000 If the load increase is too significant or insignificant, the desired adaptation won't take place.
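To make the quoted progression concrete, here is a toy sketch: exercise selection, frequency, and volume stay fixed, and only the load increases each session, roughly every 48 to 72 hours. The starting weight and 5 lb increment are illustrative assumptions, not a prescription from the book.

```python
# Toy sketch of a novice linear progression: sets and reps stay constant,
# the load goes up by a fixed increment every session (every 48-72 hours).
# Starting weight and increment are illustrative assumptions.
def linear_progression(start_lbs: float, increment_lbs: float, sessions: int):
    """Yield the working weight for each session of a novice block."""
    for s in range(sessions):
        yield start_lbs + s * increment_lbs

# Example: squat 3x5 starting at 135 lb, adding 5 lb per session.
for i, load in enumerate(linear_progression(135, 5, 9), start=1):
    print(f"session {i}: 3x5 @ {load} lb")
```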
02:32:56.000 Yeah, this is the intelligence.
02:32:57.000 This is intelligence.
02:32:58.000 This is the intelligence involved in lifting that people who are on the outside of it would dismiss.
02:33:04.000 Science, yeah.
02:33:05.000 Yes.
02:33:05.000 Yeah.
02:33:07.000 You know, I'm so honored to be the guy that introduces Joe Rogan to Starting Strength.
02:33:13.000 GoMad.
02:33:14.000 Rippetoe is so funny.
02:33:14.000 Yeah, GoMad.
02:33:16.000 You should watch some of his videos.
02:33:17.000 He has this very thick Texan accent, and he just, his audience shits on him all the time.
02:33:23.000 They call him fat and ugly and whatever.
02:33:26.000 And he abuses his audience, too.
02:33:29.000 So there it is.
02:33:31.000 Put his picture up.
02:33:32.000 That's the nerd.
02:33:34.000 That's an old photo.
02:33:35.000 He's not much fatter.
02:33:37.000 Yeah.
02:33:39.000 So he's just a nerd.
02:33:40.000 Yeah, he's a huge nerd.
02:33:42.000 But yeah, he used to lift a lot of weight.
02:33:45.000 Yeah, there's a lot.
02:33:46.000 That's what he used to look like.
02:33:47.000 That one photo with him with the hairy chest.
02:33:50.000 The black.
02:33:51.000 Oh, okay.
02:33:53.000 Wow.
02:33:54.000 Damn.
02:33:55.000 Is that him?
02:33:56.000 Is that him?
02:33:57.000 Really?
02:33:57.000 I don't think so.
02:33:58.000 It does look like him.
02:33:59.000 Yeah, that's him.
02:34:00.000 He used to be jacked.
02:34:01.000 Okay.
02:34:01.000 That's good.
02:34:02.000 Oh, so he was a bodybuilder at one point in time.
02:34:05.000 But then he got on that GOMAD shit.
02:34:07.000 And now he's a powerlifter.
02:34:09.000 Simply no other exercise, no machine, provides the level of muscular stimulation and growth of the correctly performed full squat.
02:34:17.000 Well, he's deadlifting in that image.
02:34:19.000 That's weird.
02:34:20.000 So he also makes you squat on every day of lifting.
02:34:25.000 Oh.
02:34:26.000 So squat every time, every time you lift.
02:34:28.000 Really?
02:34:29.000 Yeah, yeah.
02:34:30.000 Well, his idea is like squat is a full body exercise.
02:34:33.000 Like you can just go to the gym.
02:34:35.000 And when I used to be busy and I just wanted to maintain, like, be healthy, I'd just squat every time.
02:34:42.000 15, 20 minutes squat and just get out of the gym.
02:34:45.000 Yeah.
02:34:46.000 Well, I do something with legs every day.
02:34:48.000 Yeah.
02:34:49.000 You have to.
02:34:49.000 Yeah.
02:34:51.000 But squat, actually, it does feel like there's an upper body component to it as well.
02:34:55.000 Well, it's also your body recognizes like, oh, this asshole wants to lift really heavy things.
02:34:59.000 We've got to get big.
02:35:00.000 Right, exactly.
02:35:00.000 Yeah, it's the best way to get big.
02:35:02.000 Yeah.
02:35:02.000 Yeah, because your body just realizes, like, okay, we have to adapt.
02:35:06.000 This shithead wants to lift giant things every day.
02:35:11.000 Yeah.
02:35:11.000 Yeah, it's hilarious.
02:35:12.000 And, you know, the other one, I'm sure you know him.
02:35:15.000 I think you introduced me to him through your podcast, Louie Simmons.
02:35:19.000 Oh, yeah.
02:35:20.000 Those guys are crazy.
02:35:21.000 You watch the Netflix documentary?
02:35:23.000 I didn't watch the Netflix documentary, but we did actually interview him.
02:35:27.000 He's like one of the few people that I traveled to go meet who went to Westside Barbara.
02:35:31.000 I saw that.
02:35:31.000 It was great.
02:35:32.000 We have some of his equipment out here.
02:35:34.000 He has reverse hyper?
02:35:35.000 Yeah.
02:35:36.000 Reverse hyper is so good for people that have back problems.
02:35:39.000 Everyone that has a back issue, let me show you something.
02:35:42.000 And I bring them out to the reverse hyper machine, and I'm like, this thing will actively strengthen and decompress your spine.
02:35:48.000 Right, right.
02:35:48.000 It's so good.
02:35:49.000 It's so good for people that have like lower back issues where the doctor just wants to cut them.
02:35:53.000 I'm like, hold on, hold on.
02:35:56.000 Don't do that right away.
02:35:57.000 I had back pain since my late teens, and the doctors, like, they did an MRI and they found that there's a bit of a bulge, and they wanted to do an operation on it.
02:36:14.000 Yeah, they wanted to do a discectomy.
02:36:15.000 Someone wanted to put me on antidepressants.
02:36:17.000 Apparently, you can manage pain with antidepressants.
02:36:19.000 Have you heard of that?
02:36:21.000 Yeah, apparently it's a thing.
02:36:25.000 And through listening to your podcast and others, I was like, I was just going to get strong.
02:36:29.000 So I got strong squats and things like that.
02:36:32.000 And the pain got a lot better.
02:36:34.000 It didn't go away entirely.
02:36:36.000 But the thing that really got me over the hump, and this one's crazy.
02:36:42.000 Are you familiar with the mind-body prescription?
02:36:45.000 No.
02:36:46.000 John Sarno?
02:36:47.000 Oh, okay, yes.
02:36:48.000 I heard about him on Howard Stern because he was talking about how a lot of back pain is psychosomatic.
02:36:52.000 Psychosomatic, yeah.
02:36:54.000 So his idea, and again, this is like... Because a lot of back pain is real as fuck.
02:37:01.000 Right, right.
02:37:04.000 I think for me, it's always a combination of both.
02:37:07.000 Like, there's something physically happening.
02:37:09.000 But, like, his idea is that your mind is creating the pain to distract you from emotional, psychological pain.
02:37:21.000 I think that's the case in some people.
02:37:23.000 Yeah.
02:37:23.000 And then doctors will go do an image, and often they'll find something.
02:37:31.000 And he thinks that lumbar imperfections are present in almost everyone.
02:37:37.000 Yes.
02:37:37.000 I think that's true.
02:37:38.000 And then the doctors latch on to that.
02:37:44.000 And your mind latches onto that.
02:37:47.000 And you start reinforcing, telling yourself that I have this thing, and the pain gets worse.
02:37:56.000 There's also another thing called the salience network.
02:37:58.000 Have you heard of this?
02:37:59.000 No.
02:38:00.000 If you can bring up the Wikipedia page for salience network, because I don't want to get it wrong, but the salience network is a network in the brain that neuroscientists found.
02:38:11.000 My doctor, Taddy Akiki, told me about this.
02:38:15.000 The salience network gets reinforced whenever you obsess over your pains or your health issues.
02:38:29.000 That makes sense.
02:38:30.000 So it's responsible for perception, and it's like a muscle.
02:38:36.000 The more you reinforce it, sort of like reinforcement learning in AI, the more of an issue it becomes.
02:38:42.000 It's involved in various functions, including social behavior, self-awareness, and integrating sensory, emotional, and cognitive information.
02:38:48.000 Boy, I bet social media is really bad for that.
02:38:51.000 Right, totally.
02:38:52.000 Yeah.
02:38:52.000 Yeah.
02:38:53.000 Right.
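A toy way to see the reinforcement analogy he's drawing, a minimal sketch and not a neuroscience model, with every name and number made up for illustration: hold the raw signal constant and let a learned salience weight grow each time the signal gets attention, so the same input is perceived as stronger over time.

```python
# Toy illustration of the reinforcement idea: a constant raw signal
# whose learned "salience" weight grows every time it gets attention,
# so perceived intensity ramps up even though nothing physical changed.

RAW_PAIN = 1.0        # constant underlying signal
LEARNING_RATE = 0.2   # how strongly attention reinforces salience

def perceive(raw: float, weight: float) -> float:
    """Perceived intensity scales with the learned salience weight."""
    return raw * weight

weight = 1.0
for day in range(1, 8):
    perceived = perceive(RAW_PAIN, weight)
    print(f"day {day}: perceived pain = {perceived:.2f}")
    # Ruminating on the sensation reinforces the weight,
    # loosely like a reward update in reinforcement learning.
    weight += LEARNING_RATE * perceived
```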
02:38:54.000 And so a lot of the fatigue and things like that, at some point, I'm like, fuck it.
02:39:01.000 I did a lot of other things, but at some point, I'm like, fuck it.
02:39:04.000 I don't care about it.
02:39:05.000 I don't have it.
02:39:08.000 I'm just going to be good.
02:39:10.000 Just not concentrate on that.
02:39:11.000 Yeah, because I was reading about it all the time.
02:39:13.000 I was really worried.
02:39:16.000 I had Abigail Shrier on, and we were talking about that in regards to cognitive therapy, that there's a lot of people that obsess on their problems so much that their problems actually become bigger.
02:39:26.000 And this is it.
02:39:26.000 Yes.
02:39:27.000 This is the neuroscience behind it, the salience network.
02:39:31.000 Makes sense.
02:39:32.000 Yeah.
02:39:33.000 But there's legit back problems.
02:39:35.000 Of course.
02:39:36.000 Legit back.
02:39:36.000 That's why the John Sarno thing, I was like, okay, not for me.
02:39:40.000 I understand how some people could develop that issue.
02:39:44.000 But his insight was, look, look, I ran a clinic in New York City for a long time, and these chronic illnesses come in waves.
02:39:54.000 There was an ulcers wave in the, like, 90s.
02:39:58.000 Oh, because it became a thing that people were talking about a lot.
02:40:01.000 Yes.
02:40:02.000 And then there's like a neck pain wave, and then there's RSI.
02:40:02.000 Wow.
02:40:06.000 The most recent one was RSI.
02:40:08.000 What is RSI?
02:40:09.000 Repetitive strain injury.
02:40:11.000 And again, all these things have rational explanations.
02:40:18.000 For me, I was on the computer all the time.
02:40:21.000 And I was like, oh, my arm hurts.
02:40:25.000 And yeah, maybe there was some aspect of it.
02:40:29.000 I was programming a lot.
02:40:31.000 But also, after I read John Sarno, I realized that some of it might also be psychological, that it's stress.
02:40:39.000 I don't know what it is, maybe I have some childhood issues, but you just realize that a lot of it is that, and maybe the other way is true as well.
02:40:49.000 When you just minimize it, it just becomes less of an issue in your mind.
02:40:54.000 But the fact that it comes in fashions should tell you that there's something psychosomatic about it.
02:41:00.000 Right.
02:41:00.000 The fact that it does come in waves like that, for sure.
02:41:03.000 And then once it's in the zeitgeist, ulcers or whatever it is.
02:41:08.000 I remember when we were kids, everyone had ulcers.
02:41:11.000 And it was like, oh, it's from coffee in the morning.
02:41:15.000 And like, there's all these.
02:41:16.000 I don't know anyone that has ulcers now.
02:41:18.000 Right.
02:41:18.000 I don't either.
02:41:19.000 That's true.
02:41:21.000 That's crazy.
02:41:22.000 It's wild, the mind, like the way it can benefit you or the way it can hold you prisoner.
02:41:22.000 That's wild.
02:41:29.000 Yeah.
02:41:30.000 And again, this is maybe why I have, like, a little different view about AI and humans and all of that from Silicon Valley.
02:41:40.000 Like this is a weird thing, but every time I set my mind to like meet someone, I meet them, including you.
02:41:51.000 Oh, wow.
02:41:52.000 That's weird.
02:41:53.000 Like, yeah, I want to meet this person.
02:41:55.000 Something happens, some chain of events, but obviously you also see it in some way.
02:42:01.000 But obviously you're doing something very... Which is my problem with The Secret and the power of manifesting things.
02:42:12.000 I don't go that far, but I don't know.
02:42:15.000 There's something.
02:42:16.000 There's something there.
02:42:17.000 I agree.
02:42:18.000 There's something to it.
02:42:22.000 I think the mind and our connection to reality is not as simple as we've been told.
02:42:30.000 Not at all.
02:42:31.000 I think there's something there.
02:42:33.000 And again, when you start looking at psychedelics and stuff like that, there's something there.
02:42:38.000 And I remember listening to one of them. I love JRE circa the early 2010s.
02:42:47.000 There was a remote viewing.
02:42:50.000 You were talking about a remote viewing episode.
02:42:53.000 And I was like, wow, that's crazy.
02:42:56.000 And obviously very skeptical of it.
02:42:58.000 The idea that you can meditate and like see somewhere else or see it from above.
02:43:04.000 I read a book about Da Vinci.
02:43:07.000 It's called Da Vinci's Brain, I think.
02:43:11.000 And Da Vinci is like fascinating.
02:43:13.000 Who's this fucking guy?
02:43:15.000 He does everything.
02:43:16.000 And he literally is across all these domains.
02:43:21.000 And he barely sleeps.
02:43:23.000 He has this polyphasic sleep thing, which I tried once.
02:43:26.000 It's torture.
02:43:28.000 Basically, every four hours you sleep for 15 minutes.
02:43:34.000 When I was in university, I was very good at computer science, but I hated going to school.
02:43:41.000 And in Jordan, if you don't go to school, they ban you from the exam.
02:43:45.000 Oh, wow.
02:43:46.000 I was getting A's, but I just didn't want to sit in class.
02:43:51.000 And actually, this is when I started thinking about programming on my phone.
02:43:54.000 I was like, maybe I can code my phone in class.
02:43:58.000 But I felt there was injustice.
02:44:01.000 ADHD, whatever you want to call it.
02:44:02.000 I just can't sit in class.
02:44:04.000 Just give me a break.
02:44:05.000 And so I felt justified to rebel or fix the situation somehow.
02:44:11.000 So I decided to hack into the university and change my grades so I could graduate, because everyone was graduating.
02:44:18.000 It was like five years in.
02:44:19.000 It took me six years to get through a four-year program just because I couldn't sit in class, and I have some dyslexia and things like that.
02:44:30.000 So I decided to do that.
02:44:34.000 And I'm like, okay, hacking takes a lot of time because you're coding, you're scripting, you're running scripts against servers and you're waiting.
02:44:43.000 And I'm like, to optimize my time, I'm just going to do this Da Vinci thing where every four hours... by the way, there's a Seinfeld episode where, what's his name, the crazy guy in Seinfeld?
02:44:56.000 Kramer?
02:44:57.000 Kramer does polyphasic sleep.
02:44:57.000 Kramer.
02:45:00.000 Maybe I learned it from there.
02:45:01.000 I'm not sure.
02:45:02.000 How do you wake up?
02:45:04.000 You set an alarm.
02:45:05.000 Oh, God.
02:45:06.000 Yeah, it's torture.
02:45:07.000 That sounds so crazy.
02:45:09.000 Apparently, Da Vinci used to do that.
02:45:10.000 But anyways, by working for weeks using polyphasic sleep, I was able to hack into the university and change my grades.
02:45:21.000 And initially, I didn't want to do it on myself, but I had a neighbor who went to school with me, and I was like, let's change his grade and see if it actually succeeds.
02:45:35.000 And it actually succeeded in his case.
02:45:37.000 He was my lab rat.
02:45:38.000 But in my case, I got caught.
02:45:42.000 And the reason I got caught is that in the database, there's your grade out of 100, 0 to 100.
02:45:54.000 When you get banned because of attendance, your grade is de facto 35.
02:46:00.000 So I thought I would just change that, and that's the thing that will get me to pass.
02:46:05.000 Well, it turns out there's another field in the database about whether you're banned or not.
02:46:11.000 This is bad coding, this is bad programming, because this database is not normalized.
02:46:15.000 The same state lives in two different fields.
02:46:18.000 So I'll put the blame on them for not designing the right database.
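A minimal sketch of the normalization failure he's describing, with hypothetical table and column names: the banned state is stored twice, implicitly as the de facto grade of 35 and explicitly as a separate flag, so changing one field without the other produces a record the rest of the system can't interpret.

```python
import sqlite3

# Denormalized design (illustrative names only): the "banned" state is
# encoded twice, implicitly as grade = 35 and explicitly in a flag.
# Updating one without the other creates an inconsistent record,
# the kind of anomaly described above.

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE records (
        student TEXT PRIMARY KEY,
        grade   INTEGER,   -- 0..100; a ban forces this to 35
        banned  INTEGER    -- 0 or 1; duplicates the same state
    )
""")
db.execute("INSERT INTO records VALUES ('student_a', 35, 1)")

# The hack: bump the grade but forget the redundant flag.
db.execute("UPDATE records SET grade = 70 WHERE student = 'student_a'")

# Consistency check of the kind the registrar's system tripped on:
# a passing grade on a record still marked banned.
grade, banned = db.execute(
    "SELECT grade, banned FROM records WHERE student = 'student_a'"
).fetchone()
if grade > 35 and banned == 1:
    print("anomaly: passing grade on a banned record")
```

In a normalized design, the ban would live in exactly one place and the effective grade would be derived from it, so no single update could leave the record contradicting itself.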
02:46:23.000 That's hilarious.
02:46:25.000 You blame them for your hacking being successful.
02:46:27.000 So what was the punishment?
02:46:29.000 So the entire university system went down because there's this anomaly.
02:46:33.000 I was, you know, I passed, but at the same time, I was banned.
02:46:39.000 And so I got a call from the head of the registration system.
02:46:46.000 And it was like 7 p.m., whatever.
02:46:48.000 It was landline.
02:46:49.000 And I picked up the call.
02:46:50.000 He's like, hey, listen, we have this issue we're dealing with.
02:46:52.000 Like, the entire thing is down.
02:46:54.000 And it just shows your record.
02:46:56.000 There's a problem with it.
02:46:57.000 Do you know anything about it?
02:46:59.000 And at the time, I'm like, all right, there's like a fork in the road.
02:47:02.000 You know, I either, like, come clean, or this is a lie that will, like, live with me forever.
02:47:08.000 And I'm like, I'm just going to say it. I was like, yeah, I did it.
02:47:12.000 And he was like, what do you mean?
02:47:13.000 I was like, okay, I'll come explain it to you.
02:47:15.000 So the next day, I go there, and it's all the university deans.
02:47:19.000 And it's like one of the best computer science universities in the region, the Princess Sumaya University for Technology.
02:47:25.000 And they're all nerds.
02:47:27.000 So the discussion became technical on how I hacked in the university.
02:47:32.000 And I went to the whiteboard, explaining what I did with the system and whatever.
02:47:37.000 And it just felt like a brainstorming session.
02:47:38.000 I'm like, all right, I'll see you guys later.
02:47:40.000 It's like, wait, we need to figure out what to do with you.
02:47:44.000 Like, you know, this is serious.
02:47:46.000 And I'm like, oh, crap.
02:47:49.000 But they kind of put the decision to the president.
02:47:54.000 And he was, I forgot his name, but he was such an enlightened guy.
02:47:59.000 And I went and told him, like, I just didn't mean any malice.
02:48:02.000 I just felt like justified.
02:48:04.000 I need to graduate.
02:48:05.000 I've been here for a long time.
02:48:06.000 I actually do good work.
02:48:09.000 And he's like, look, you're talented, but with great power comes great responsibility.
02:48:14.000 He gave me the Spider-Man line.
02:48:17.000 And he said, for us to forgive you, you're going to have to go and harden the systems in the university against hacking.
02:48:26.000 So I spent the summer trying to work with the engineers at the university to do that.
02:48:32.000 But they hated me because I'm the guy that hacked into the system.
02:48:36.000 So they would blackball me.
02:48:39.000 Sometimes I'd show up to work and they wouldn't open the door, and I could see them.
02:48:42.000 Like, I can see you there.
02:48:43.000 I'm knocking.
02:48:44.000 And they wouldn't let me in and let me work with them.
02:48:48.000 We did some stuff to fix it.
02:48:50.000 And then I gained fame, maybe notoriety, in the university.
02:48:57.000 And it actually got me my first job while I was in school.
02:49:03.000 And it's a different story, but that job was at a startup that ended up making videos that were a big part of the Arab Spring.
02:49:13.000 Yeah.
02:49:13.000 Oh, wow.
02:49:14.000 And I was part of some of these videos as well.
02:49:17.000 But, anyways, so one of the deans, the computer science dean, was like, hey, listen.
02:49:25.000 I really helped you out when you had this problem.
02:49:27.000 And I need you to work with me to do more research, to hack into the university again.
02:49:32.000 I was like, I'm not going to do that.
02:49:36.000 No, he's like, no, you're not going to get in trouble.
02:49:39.000 It's sanctioned.
02:49:40.000 It's going to be sanctioned.
02:49:42.000 So, again, I worked tirelessly on that.
02:49:44.000 This time, I invented a piece of software to help me do that.
02:49:50.000 And I was able to find more vulnerabilities.
02:49:56.000 And so I show up at my project defense.
02:49:58.000 And it's like a committee of different deans and students and all of that.
02:50:04.000 And so I go up and I start explaining my project.
02:50:09.000 And I run a scan against the university network.
02:50:12.000 And it showed a bunch of red, like there's vulnerabilities.
02:50:15.000 And one of the deans is like, no, that's fake.
02:50:19.000 That's not true.
02:50:20.000 It started dawning on me that I was, like, a pawn in some kind of power struggle.
02:50:26.000 So that guy was responsible for the university system.
02:50:29.000 And this guy was using me, too.
02:50:34.000 I was like, oh, shit.
02:50:35.000 But like, I'm not going to back down.
02:50:36.000 I was like, no, that's not a lie.
02:50:37.000 It's true.
02:50:38.000 And so I tap into that vulnerability and I go into the database and I'm like, all right, what do you want me to show?
02:50:47.000 Your salary or your password?
02:50:50.000 It was like, show me a password.
02:50:51.000 So I show him the password, and he was like, no, that's not my password.
02:50:57.000 It was encrypted.
02:50:58.000 But they also had in the database, like, a decrypt function, which they shouldn't have had, but they did.
02:51:03.000 So I was like, decrypt the password.
02:51:05.000 And the password showed on the screen in the middle of the defense.
02:51:08.000 And so his face was red.
02:51:11.000 He shakes my hand and he leaves to change his password.
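The detail that sank the dean, a decrypt function living in the database next to the passwords, is the opposite of standard practice. Below is a minimal sketch of the usual alternative, one-way salted hashing, purely illustrative and not a claim about how that university's system worked: there is nothing to decrypt, so a breach of the table reveals digests, not passwords.

```python
import hashlib
import hmac
import os

# One-way salted hashing: unlike a reversible scheme with a decrypt
# function, the server can check a guess against the stored digest
# but can never recover the original password.

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique per user, stored alongside the digest
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("hunter2")
print(verify("hunter2", salt, digest))  # True
print(verify("wrong", salt, digest))    # False
```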
02:51:18.000 That's awesome.
02:51:19.000 And I graduated.
02:51:20.000 And they cut me some slack and I was able to graduate.
02:51:24.000 That's awesome.
02:51:25.000 That's a great story.
02:51:26.000 We'll end with that.
02:51:28.000 Thank you very much, brother.
02:51:29.000 I really appreciate it.
02:51:29.000 It was really fun.
02:51:30.000 That was a great conversation.
02:51:32.000 Thank you.
02:51:33.000 Your app, let everybody know about it.
02:51:34.000 Replit, R-E-P-L-I-T.
02:51:36.000 How to find it?
02:51:37.000 There it is.
02:51:38.000 Replit.
02:51:39.000 Replit.com.
02:51:40.000 Go make some apps.
02:51:41.000 Go make some apps, people.
02:51:43.000 Avoid whatever the hell is going to happen today.
02:51:47.000 All right.
02:51:47.000 Thank you very much.
02:51:48.000 Thank you.