The Joe Rogan Experience - January 31, 2023


Joe Rogan Experience #1934 - Lex Fridman


Episode Stats

Length

3 hours and 10 minutes

Words per Minute

179.2

Word Count

34,128

Sentence Count

3,087

Misogynist Sentences

43

Hate Speech Sentences

29


Summary

Lex Fridman joins Joe to break down ChatGPT: how it grew out of the GPT line of language models, why training on programming code seems to teach something like reasoning, and how supervised fine-tuning plus reinforcement learning from human rankings points a 175-billion-parameter network toward answers that feel smart. From there the conversation ranges across AI-generated people and fake OnlyFans accounts, Ex Machina, VR, Brave New World and genetic engineering, falling fertility rates and phthalates, a grisly Google-search murder case, and the UFO question, from the Tic Tac encounter to what alien observers would make of us.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:05.000 Train by day, Joe Rogan Podcast by night, all day!
00:00:12.000 What's good, brother?
00:00:12.000 How are you?
00:00:13.000 Good to see you, my friend.
00:00:14.000 Good to see you.
00:00:14.000 Hey, what have your people done?
00:00:16.000 Your AI people with this fucking ChatGPT shit.
00:00:19.000 This scares the fuck out of me.
00:00:21.000 It's your people.
00:00:21.000 What do you mean, your people?
00:00:23.000 Your AI people.
00:00:24.000 You're wacky coders.
00:00:25.000 What have you done?
00:00:26.000 Yeah, it's super interesting.
00:00:28.000 Fascinating.
00:00:29.000 Language models, I don't know if you know what those are, but those are the general systems that underlie ChatGPT and GPT. They've been progressing aggressively over the past maybe four years.
00:00:41.000 There's been a lot of development.
00:00:43.000 GPT-1, GPT-2, GPT-3, GPT-3.5.
00:00:48.000 And ChatGPT. There's a lot of interesting technical stuff there that maybe we don't want to get into.
00:00:53.000 Sure, let's get into it.
00:00:54.000 I'm fascinated by it.
00:00:56.000 So, ChatGPT is based, fundamentally, on a 175 billion parameter neural network that is GPT-3.
00:01:06.000 And the rest is what data is it trained on and how is it trained.
00:01:10.000 So you already have like a brain, a giant neural network, and it's just trained in different ways.
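A minimal sketch of what that training looks like, for readers who want the mechanics Lex is gesturing at: GPT-style models are trained on one deceptively simple objective, predicting the next token of text. The PyTorch code below is illustrative only; the function and variable names are hypothetical and make no claim to match OpenAI's internal code.

```python
import torch
import torch.nn.functional as F

def next_token_loss(model, token_ids):
    """Standard autoregressive language-modeling loss.

    token_ids: tensor of shape (batch, seq_len) holding integer token IDs.
    model: any network mapping token IDs to per-position vocabulary logits.
    """
    inputs = token_ids[:, :-1]    # every token except the last
    targets = token_ids[:, 1:]    # every token except the first
    logits = model(inputs)        # (batch, seq_len - 1, vocab_size)
    # Cross-entropy between the predicted distribution at each position
    # and the token that actually came next in the training text.
    return F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
    )
```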
00:01:15.000 So GPT-3 came out about two years ago, and it was, like, impressive but dumb in a lot of ways.
00:01:23.000 It was like you would expect as a human being for it to generate certain kinds of text and it was like saying kind of dumb things that were off.
00:01:31.000 You know, like alright, this is really impressive but it's not quite there.
00:01:34.000 You can tell it's not intelligent.
00:01:36.000 And what they did with GPT-3.5 is they started adding more and different kinds of datasets there.
00:01:45.000 One of them, probably the smartest neural network currently, is Codex, which is fine-tuned for programming.
00:01:53.000 It was trained on code, on programming code.
00:01:57.000 And when you train on programming code, which ChatGPT is also, you're teaching it something like reasoning.
00:02:03.000 Because it's no longer information and knowledge from the internet, it's also reasoning.
00:02:09.000 You can, like, logic.
00:02:10.000 Even though you're looking at code, programming code is, you're looking at me like, what the fuck is he talking about?
00:02:15.000 No, no, no, that's not what I'm looking at.
00:02:16.000 I'm looking at you like, oh my god.
00:02:18.000 But reasoning is, in order to be able to stitch together sentences that make sense, you not only need to know the facts that underlie those sentences, you also have to be able to reason.
00:02:28.000 And we take it for granted as human beings that we can do some common sense reasoning.
00:02:34.000 Like, this war started at this date and ended at this date, therefore it means that...
00:02:40.000 The start and the end has a meaning.
00:02:43.000 There's a temporal consistency.
00:02:45.000 There's a cause and effect.
00:02:46.000 All of those things are inside programming code.
00:02:49.000 By the way, a lot of stuff I'm saying we still don't understand.
00:02:51.000 We're like intuiting why this works so well.
00:02:55.000 Really?
00:02:55.000 These are the intuitions.
00:02:56.000 Yeah, there's a lot of stuff that's not clear.
00:03:00.000 So GPT-3.5, which ChatGPT is likely based on.
00:03:05.000 There's no paper yet, so we don't know exactly the details.
00:03:07.000 But it was just trained on code and more data that's able to give it some reasoning.
00:03:15.000 Then, this is really important, it was fine-tuned in a supervised way by human labeling.
00:03:21.000 Small dataset by human labeling of here's what we would like this network to generate.
00:03:28.000 Here's the stuff that makes sense.
00:03:29.000 Here's the kind of dialogue that makes sense.
00:03:31.000 Here's the kind of answers to questions that make sense.
00:03:35.000 It's basically pointing this giant titanic of a neural network in the right direction, one that aligns with the way human beings think and talk.
00:03:43.000 So it's not just using the giant wisdom of Wikipedia.
00:03:48.000 I can talk about what data sets it's trained on, but just basically the internet.
00:03:52.000 It was pointed in the wrong direction.
00:03:55.000 Supervised labeling allows it to point in the right direction to when it says shit, you're like, holy shit, that's pretty smart.
00:04:02.000 So that's the alignment.
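A rough sketch of the supervised step just described: the pretrained network keeps training with the same next-token loss, but now on a small, human-written set of "here is the kind of answer we want" demonstrations. The dataset format, the Hugging Face-style tokenizer call, and the hyperparameters are illustrative assumptions, not OpenAI's actual recipe; `next_token_loss` is the pretraining loss sketched earlier.

```python
import torch

def fine_tune(model, demonstrations, tokenizer, steps=1000, lr=1e-5):
    """Supervised fine-tuning on (prompt, ideal_response) string pairs.

    Assumes `tokenizer` follows the Hugging Face interface and that
    `next_token_loss` is the loss function sketched above.
    """
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    for step in range(steps):
        prompt, response = demonstrations[step % len(demonstrations)]
        # Train on the concatenation so the model learns to continue
        # a prompt with the demonstrated answer.
        token_ids = tokenizer(prompt + response, return_tensors="pt").input_ids
        loss = next_token_loss(model, token_ids)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```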
00:04:03.000 And then they did something really interesting is using reinforcement learning based on labeling data from humans.
00:04:12.000 That's quite a large data set.
00:04:14.000 The task is the following.
00:04:16.000 You have this smart GPT-3.5 thing generate a bunch of text, and humans label which one seems the best.
00:04:24.000 So ranking.
00:04:26.000 Like you ask it a question.
00:04:28.000 For example, you could generate a joke in the style of Joe Rogan, right?
00:04:32.000 And you have a label.
00:04:34.000 There's five options.
00:04:35.000 And you have a label.
00:04:37.000 Does it mention dick and pussy?
00:04:39.000 I don't know how exactly, but you get it to rank.
00:04:45.000 The human labelers are just sitting there.
00:04:48.000 There's a very large number of them.
00:04:50.000 They're working full-time.
00:04:51.000 They're labeling the ranking of the outputs of this model.
00:04:54.000 And that kind of ranking used together with a technique called reinforcement learning is able to get this thing to generate very impressive to humans output.
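What Lex describes here matches what OpenAI's InstructGPT paper calls reinforcement learning from human feedback: the human rankings train a reward model, and the language model is then tuned with an RL algorithm (PPO, in that paper) to maximize that reward. Below is a minimal sketch of the reward-model half only, with hypothetical names; the RL loop itself is omitted.

```python
import torch.nn.functional as F

def reward_ranking_loss(reward_model, preferred_ids, rejected_ids):
    """Pairwise ranking loss for a reward model.

    preferred_ids, rejected_ids: token-ID tensors for two candidate
    answers to the same prompt, where human labelers ranked
    `preferred_ids` higher.
    reward_model: network mapping a token sequence to a scalar score.
    """
    r_preferred = reward_model(preferred_ids)
    r_rejected = reward_model(rejected_ids)
    # -log sigmoid(r_pref - r_rej): near zero when the preferred answer
    # already scores much higher, large when the ordering is wrong.
    return -F.logsigmoid(r_preferred - r_rejected).mean()
```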
00:05:04.000 So it's not actually, there's not a significant breakthrough in how much knowledge was learned.
00:05:08.000 That was already in GPT-3 and there was much more impressive models already trained.
00:05:13.000 So it's on the way, not just OpenAI.
00:05:15.000 But with this kind of fine-tuning, as it's called, by human labelers plus reinforcement learning, you start to get to where students don't have to write essays anymore in high school, where you can do style transfer.
00:05:31.000 Like I said, do a Louis C.K. joke in the style of Joe Rogan, or a Joe Rogan joke in the style of Louis C.K. It does an incredible job at those kinds of style transfers.
00:05:42.000 You can more accurately query things about the different historical events, all that kind of stuff.
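The style transfer needs no special machinery; you just ask for it in the prompt. A hypothetical example using the OpenAI Python client as it existed around the time of this episode (the pre-1.0 `ChatCompletion` interface); the model name is an assumption.

```python
import openai  # pip install "openai<1.0" for this legacy interface

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed model name
    messages=[{
        "role": "user",
        "content": "Tell a short joke about coffee in the style of "
                   "Mitch Hedberg, then retell it in the style of Joe Rogan.",
    }],
)
print(response.choices[0].message.content)
```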
00:05:48.000 Holy shit, man.
00:05:50.000 The idea that you don't exactly know why it works the way it works.
00:05:55.000 That's too close to human.
00:05:57.000 That's too close to human thinking.
00:05:59.000 You know what this is eerily similar to?
00:06:02.000 The plot of Ex Machina.
00:06:05.000 When he's talking about how he coded the brain.
00:06:08.000 Do you remember that plot?
00:06:10.000 That scene?
00:06:11.000 That scene when he was, yeah, no.
00:06:13.000 The gentleman, who's the, what's the gentleman's name?
00:06:15.000 The actor, that dude's badass.
00:06:17.000 Really good, really good actor.
00:06:18.000 Oscar Isaac?
00:06:19.000 Yeah.
00:06:20.000 Isaac?
00:06:21.000 Great casting.
00:06:22.000 He's amazing.
00:06:23.000 Alex Garland, the director.
00:06:24.000 Oscar Isaac.
00:06:25.000 Somebody I've gotten to know.
00:06:26.000 Oscar Isaac.
00:06:26.000 He's in Star Wars and shit, too.
00:06:27.000 Yeah, no, that movie was one of, it's one of my top tens.
00:06:31.000 I love that movie.
00:06:32.000 But that scene where he's... Below John Wick 1, 2, and 3?
00:06:37.000 Well, three of them, I'm not a fan of three.
00:06:39.000 Three didn't have any muscle cars.
00:06:41.000 Still worse than Scent of a Woman.
00:06:42.000 Go on.
00:06:44.000 It's worse than Scent of a Woman.
00:06:46.000 Which one?
00:06:46.000 John Wick 3 or 1?
00:06:48.000 All of them.
00:06:48.000 How dare you?
00:06:49.000 All of them.
00:06:50.000 It's silly man movies.
00:06:52.000 Yeah, you ever watch them when you're on the treadmill, though?
00:06:54.000 No, I don't.
00:06:55.000 Gives you motivation.
00:06:56.000 Yeah.
00:06:57.000 It's constant action.
00:07:00.000 You ever watch them a hundred times?
00:07:02.000 Which apparently you have.
00:07:03.000 Well, I was trying to win a bet.
00:07:06.000 Rocky is better, I think, for that.
00:07:08.000 Really?
00:07:09.000 I'm a sucker for Rocky.
00:07:10.000 The whole soundtrack.
00:07:13.000 I can't get over the bad fight scenes.
00:07:16.000 Oh, the bad fight scenes.
00:07:17.000 I can't.
00:07:18.000 My disconnect, it won't allow that.
00:07:20.000 Have you seen the montages recently?
00:07:22.000 No.
00:07:23.000 They're cheesy as hell and they still work.
00:07:26.000 Because he's doing the kind of fitness he's doing.
00:07:28.000 He's doing, like, pull-ups and, like, he's doing the silliest of stuff, even Drago.
00:07:32.000 It's silly.
00:07:33.000 Anyway.
00:07:33.000 It's just...
00:07:34.000 There's so much corniness to the actual physical confrontations.
00:07:40.000 Sure.
00:07:40.000 Like, as an analyst.
00:07:42.000 You know, I'm like, come on.
00:07:44.000 It doesn't work like this.
00:07:45.000 Which is the interesting thing about Ex Machina, for me: as somebody who knows about AI and robotics, the corny signal doesn't go off.
00:07:54.000 What is this?
00:07:55.000 So this is the one where he's in Russia?
00:07:57.000 Doing the old school training?
00:07:58.000 Running in the snow.
00:08:00.000 Jogging in the snow.
00:08:00.000 And that's supposed to be badass.
00:08:02.000 And then the other dude, Drago, was using machines and computers and shit.
00:08:06.000 I put these computers in.
00:08:07.000 That's where I found out about the VersaClimber.
00:08:10.000 I'm like, that's got to be the most fucking high-tech way to work out ever.
00:08:13.000 We have one of those.
00:08:14.000 They're the shit.
00:08:15.000 You ever used one?
00:08:16.000 No.
00:08:17.000 It's brutal.
00:08:18.000 They're hard.
00:08:19.000 But that movie's dumb as shit.
00:08:20.000 Fuck out of here.
00:08:21.000 Oh, look.
00:08:22.000 I like, though, cold exposure, doing a little crawling, pulling sleds.
00:08:25.000 All good.
00:08:27.000 See how they mimic each other?
00:08:30.000 One of them's old school.
00:08:31.000 Old school's always better.
00:08:33.000 Yeah, you don't want computers, technology and shit.
00:08:36.000 You want to do it with a log out there in the fucking snow doing press-ups.
00:08:40.000 Yeah.
00:08:41.000 But technology can mimic that.
00:08:43.000 Can mimic the romance of nature and humanness.
00:08:47.000 That's the whole point.
00:08:47.000 100%.
00:08:48.000 That's what Ex Machina is doing, right?
00:08:50.000 Yes, that's what's scary.
00:08:51.000 And then in this, well, that scene where she gets him to fall in love with her, it's so creepy when she comes back with clothes on and she's got a wig and you're like, oh my god.
00:09:04.000 Like, it's so subtle.
00:09:05.000 Like, it's so well done.
00:09:07.000 The scene is so well done.
00:09:08.000 But that's what ChatGPT's doing.
00:09:11.000 They're real close.
00:09:13.000 Duncan sent me a series.
00:09:16.000 Of course he did.
00:09:16.000 Of course he did.
00:09:17.000 Duncan's, he's using it right now.
00:09:18.000 While we're talking, I'm sure Duncan's on it.
00:09:20.000 But he sent me this series of jokes that were done.
00:09:24.000 One, me talking about aliens.
00:09:27.000 Sound exactly like how I would talk.
00:09:29.000 And then it was Mitch Hedberg doing a joke about something.
00:09:34.000 And, you know, you could, like, ask it to do different ones.
00:09:39.000 Oh, yeah, here it is.
00:09:40.000 Okay.
00:09:41.000 Yeah, oh, like, so you could do a Mitch Hedberg joke.
00:09:45.000 It goes, uh, I was gonna stay overnight at my friend's place.
00:09:49.000 He said, you're gonna have to sleep on the floor.
00:09:53.000 Damn gravity, you got me again.
00:09:57.000 You know how badly I want to sleep on the wall?
00:09:59.000 That sounds exactly like a Mitch Hedberg joke.
00:10:02.000 That's a pretty good joke, or a good start of a joke.
00:10:04.000 That's exactly like a Mitch Hedberg joke.
00:10:07.000 That's creepy as fuck, man!
00:10:09.000 Yeah.
00:10:10.000 That's creepy as fuck!
00:10:12.000 Maybe you could give it to bands when they fall off.
00:10:14.000 Like, you lose something.
00:10:16.000 You're losing something, guys.
00:10:17.000 Like, you gotta get back to what you were before.
00:10:20.000 You guys had a hunger in 1978 that, for whatever reason, it slipped through your fingers, and now you gotta like...
00:10:28.000 Like Rolling Stones songs.
00:10:29.000 Just imagine if GPT wrote it.
00:10:31.000 If they perform it and they don't rewrite anything, I bet you they can have some hits.
00:10:35.000 Bro, the Stones stayed strong with great new songs deep in the 80s.
00:10:44.000 Who's the most prolific of those mega bands?
00:10:49.000 In terms of duration?
00:10:51.000 Yeah, the most prolific in terms of also new songs and new albums.
00:10:56.000 The audio's a little weird.
00:10:57.000 Audio's a little weird?
00:10:58.000 It's like robotic.
00:10:59.000 What audio?
00:11:00.000 Maybe it's through the headphones.
00:11:02.000 This one that we're listening to right now?
00:11:03.000 Yeah, yeah, yeah.
00:11:04.000 It's you, bro.
00:11:04.000 Your fucking circuits are rewired.
00:11:06.000 It's like...
00:11:07.000 You've got a new programming.
00:11:09.000 You're not used to it.
00:11:10.000 It's kind of cool, actually.
00:11:11.000 I don't hear it at all!
00:11:12.000 I feel like it's in the 80s video.
00:11:15.000 How far away are we from something like ChatGPT being impossible to detect?
00:11:24.000 Whether or not it's a person or whether it's ChatGPT. Well, it depends who is playing with it.
00:11:30.000 I think we're not that far away in terms of capability, but in order to use these systems and rather in order to train these systems, you have to be a large company.
00:11:40.000 And large companies tend to get scared when it's doing interesting stuff.
00:11:46.000 Really?
00:11:47.000 Well, they tend to want to, even currently with ChatGPT, it's become a lot less interesting.
00:11:52.000 Interesting as in a Bukowski, Hunter S. Thompson kind of interesting, because the companies are kind of censoring it.
00:11:59.000 You don't want it to have any kind of controversial opinions.
00:12:02.000 You don't want it to be too edgy.
00:12:03.000 Oh, really?
00:12:05.000 Like, if I ask it, how do I build the bomb?
00:12:08.000 Because I want to destroy the world.
00:12:09.000 You wanted to prevent that.
00:12:10.000 How about, how do I, I don't know, convince, I don't know anything about this, but how do I convince a dude or a girl to sleep with me?
00:12:21.000 Like anything, I'm just off the top of my head.
00:12:23.000 Anything, you start to get nervous.
00:12:25.000 Imagine if you're a company, how do I want people to use this kind of system?
00:12:29.000 Right.
00:12:30.000 Especially because it's basically an assistant that gives you wisdom about the world, gives you knowledge about the world.
00:12:35.000 And then it could be like, how do I replace a carburetor?
00:12:37.000 Yeah, that's great.
00:12:38.000 It'll just answer you like a person.
00:12:40.000 Yeah, that's great.
00:12:41.000 But then the...
00:12:43.000 There it is.
00:12:43.000 There it is.
00:12:44.000 I was trying to log in the whole time.
00:12:45.000 It was busy, which is another problem of it.
00:12:47.000 Well, it's probably...
00:12:48.000 How many fucking people are using this?
00:12:50.000 Everybody.
00:12:51.000 Everybody's using this.
00:12:52.000 It's freaking people out because it's almost like the AI gives us its first messages.
00:12:59.000 It's like...
00:13:00.000 Remember the movie...
00:13:01.000 What was the fucking movie with Matthew McConaughey and Jodie Foster?
00:13:07.000 Contact.
00:13:09.000 Contact.
00:13:10.000 Remember contact?
00:13:10.000 They get the first signals.
00:13:12.000 This is like the first signals.
00:13:14.000 Yeah.
00:13:14.000 From like a real general artificial intelligence.
00:13:18.000 Well, that's the thing.
00:13:19.000 The signal is blurry.
00:13:21.000 Yeah.
00:13:21.000 And it's full of mystery.
00:13:22.000 We're not sure.
00:13:24.000 Is it really smart?
00:13:26.000 How much does it understand?
00:13:27.000 And then there's this emergent threshold with the size of the model.
00:13:32.000 If we make the model bigger, 175 billion parameters currently.
00:13:38.000 A trillion parameters, so size of the network grows, size of the data set grows.
00:13:43.000 Is there going to be a point where you're like, holy shit.
00:13:46.000 What if it starts manipulating you with the answers?
00:13:50.000 It's going to.
00:13:50.000 It's going to manipulate world governments.
00:13:52.000 And what do you do with that?
00:13:53.000 What can you do with it?
00:13:54.000 Once it's been implemented, once it's out there, once it's copied, it's going to be copied.
00:14:01.000 And that's the cool thing about this, I should say, that everyone kind of knows how to do this.
00:14:06.000 It's computationally difficult, but it's getting cheaper and cheaper and cheaper and cheaper.
00:14:10.000 So it's not just going to be OpenAI with Microsoft or Google that's doing this.
00:14:14.000 It's basically anybody can do this.
00:14:16.000 And so that, the distributed nature of our exploration of artificial intelligence, I think if you believe that most people are good, that we will not allow sort of a centralization of power,
00:14:31.000 which is the big concern here.
00:14:33.000 Whether that's centralization of power, or censorship, or abuse of different kinds.
00:14:39.000 Centralization of power of AI? Is that what you're saying?
00:14:41.000 Over an AI. So say you have a superintelligent system.
00:14:45.000 Somebody is the first person that built it.
00:14:48.000 Imagine you're sitting there in a boardroom.
00:14:50.000 You have this thing you haven't released yet that it's able to...
00:14:57.000 Basically, it's a super intelligent.
00:14:59.000 It's able to answer any question, able to give you a plan on how to make a lot of money, able to give you a plan on how to manipulate other governments into any kind of geopolitical resolution that benefits you, all of that.
00:15:14.000 It's able to give you all of that.
00:15:16.000 And you can deploy it in a shady way where it sneaks into, like TikTok or something like that, it sneaks into everybody's smartphone.
00:15:26.000 Pretending to be doing good, but it's actually, whether deliberately or not, is controlling the population.
00:15:33.000 So that capability is there.
00:15:36.000 The great thing is the people at the head of OpenAI currently, Sam Altman, and others really care about this problem.
00:15:45.000 They were there in the beginning.
00:15:46.000 They were the ones like Elon screaming about AI ethics, AI alignment.
00:15:50.000 They're really concerned about superintelligent AI taking over.
00:15:53.000 I'm so glad they're concerned while they're building it.
00:15:56.000 Well, you'd rather have the people.
00:15:59.000 What is going on here, Jamie?
00:16:01.000 These aren't real people.
00:16:02.000 What?
00:16:03.000 Yeah, so these pictures are going around.
00:16:05.000 A lot of them look very similar to me, which is kind of weird.
00:16:07.000 I'm sure Lex can explain that part of it.
00:16:09.000 I am not explaining any of this.
00:16:12.000 So these are completely 3D, like CGI made people?
00:16:17.000 Not 3D. It's, like, very photorealistic, if not photorealistic, but when you look real close you can see some weird things going on, like the background here is a little messed up.
00:16:28.000 This arm doesn't belong to the right person.
00:16:32.000 She's sitting on an extra piece of skin here somehow.
00:16:35.000 I see you've analyzed this carefully.
00:16:36.000 Well, me and my friends have been passing this around, because, like, it's too tricky.
00:16:38.000 No, no, no, listen, you're incorrect. That arm's in the perfect position, it's just there's a string from that other girl's bikini on it.
00:16:45.000 The analysis continues.
00:16:46.000 I'm just saying.
00:16:47.000 Is that what it is?
00:16:48.000 I don't think so.
00:16:49.000 Is that a string?
00:16:50.000 No, I think you're right.
00:16:50.000 I think it's a fold.
00:16:51.000 Zoom in on that spot.
00:16:52.000 For people just listening.
00:16:54.000 Oh, yeah.
00:16:55.000 Yeah, it's nonsense.
00:16:56.000 We're looking at it.
00:16:57.000 The hand goes the wrong way.
00:16:58.000 Oh, that's wild.
00:16:58.000 There's already apparently OnlyFans accounts that are being taken over by guys running them, and people are being tricked.
00:17:03.000 Of course.
00:17:04.000 And it's just these kind of fake girls that aren't real people.
00:17:08.000 What?
00:17:08.000 These are all fake?
00:17:09.000 Yeah.
00:17:11.000 Even that's not a real door kind of to begin with.
00:17:14.000 Wow!
00:17:15.000 The hands or the fingers here are a little off.
00:17:17.000 That's insane.
00:17:18.000 And so this is right now just still images and eventually it'll be film.
00:17:22.000 Eventually it'll be unrecognizable.
00:17:24.000 You won't be able to discern whether or not it's an actual person.
00:17:28.000 I mean, in terms of, obviously, much of human civilization is driven by sex.
00:17:31.000 I mean, there was a time we didn't have easily accessible porn.
00:17:34.000 Right.
00:17:35.000 And that changed a lot.
00:17:37.000 Yeah.
00:17:37.000 I don't think we've actually quite caught up to how much it's changed the nature of human civilization.
00:17:41.000 Just easily accessible porn.
00:17:43.000 Right.
00:17:43.000 Yeah, I talk about it on stage right now.
00:17:46.000 It's very weird.
00:17:47.000 It's very weird for kids.
00:17:50.000 You really think about what's happening with kids.
00:17:52.000 Like any kid that has a smartphone.
00:17:53.000 People just leave their...
00:17:54.000 You give your kid a phone and just leave them alone.
00:17:56.000 Like they just go.
00:17:57.000 They go to school.
00:17:58.000 They go to their friend's house.
00:17:59.000 They have that phone independently of you.
00:18:01.000 They can look at whatever the fuck they want.
00:18:04.000 Some of this shit that I see just on Instagram, I don't know how these guys are doing it.
00:18:08.000 And I don't know how it's getting recommended in my feed, but it's like videos of people getting murdered.
00:18:13.000 You know?
00:18:14.000 See a lot of those?
00:18:16.000 Simulated porn.
00:18:17.000 I haven't seen that.
00:18:18.000 Strange stuff.
00:18:19.000 Well, I am.
00:18:20.000 You and I have different algorithms, you fucking creep.
00:18:24.000 But then someone gets taken down for something that's like, they call it porn and it's not porn or something.
00:18:27.000 I'm like, well, do you guys not see what else is on this platform?
00:18:30.000 I think, right, that what's going on is that they're managing at scale.
00:18:34.000 I think it's virtually impossible to stop all that stuff from coming in.
00:18:38.000 And people that have individual situations or people get banned, I mean, I don't know why they're getting banned.
00:18:44.000 Are they getting banned because of an algorithm?
00:18:46.000 Are they getting banned because they post misinformation?
00:18:51.000 Or what are they getting banned for?
00:18:54.000 Harassment photos.
00:18:55.000 Someone was joking about a friend.
00:18:57.000 They get reported.
00:18:58.000 I don't know how it's all working when it breaks down to individual circumstances.
00:19:03.000 You had a good conversation with Jordan Peterson.
00:19:05.000 He was talking about the more you have this kind of virtualization, the more you allow the psychopaths to reign free.
00:19:13.000 The more we have artificially generated porn...
00:19:16.000 The more we have artificially generated violence, photorealistic violence, the more you make it normal for you to be basically a psychopath in a digital space.
00:19:29.000 Enable that and make that okay and then you forget what it's like to actually be a good human being.
00:19:33.000 And then also part of the problem may be that we may very well be looking at a world, whether it's 10 years from now, 20 years from now, whatever it is, where these children that have grown up in this environment now have a completely different way of looking at people in the world because of all these interactions they have.
00:19:51.000 It's flavored their personality.
00:19:53.000 And then we move into a digital world.
00:19:56.000 I mean, we're not there yet in terms of virtual reality.
00:20:01.000 It's not good enough.
00:20:03.000 I think that's what we're seeing with the meta failure.
00:20:07.000 We're expecting a lot of people are just going to dive in and start wearing goggles all over the house.
00:20:11.000 But it's not quite there yet.
00:20:13.000 There's also something weird for people.
00:20:15.000 There's something really weird about wearing these head goggles.
00:20:19.000 It's really fun.
00:20:20.000 I really enjoy the boxing games.
00:20:22.000 You ever done them?
00:20:24.000 No, in VR, no.
00:20:25.000 They're great.
00:20:25.000 You get a workout.
00:20:26.000 You legitimately get a workout because you're actually sparring against a computer character.
00:20:31.000 It's throwing punches at you.
00:20:32.000 You're moving your head.
00:20:33.000 And so you have these things in your hands and you know, you get tired.
00:20:36.000 It's good.
00:20:36.000 It feels realistic.
00:20:38.000 A little bit.
00:20:39.000 You know, when you get hit with a punch, your face lights up, you get a flash of light, which is kind of cool because you're like, oh, Jesus!
00:20:46.000 You know, you feel it like you're getting hit.
00:20:48.000 There's some really fun games.
00:20:49.000 There's one where you walk a plank across these two buildings and you hear the wind whistle and shit.
00:20:54.000 Oh, that one is terrifying.
00:20:55.000 There's zombie ones.
00:20:56.000 There's a lot of cool ones, but people are just not buying into it the way they buy into Xbox and PlayStation.
00:21:02.000 They're not, like, wholesale committed to this yet, but they will be.
00:21:06.000 It's going to be so fucking good that instead of having it in a goggle form where it's like this big clunky thing on you, it's going to be very easy to do.
00:21:17.000 When they get to that, oof!
00:21:19.000 I've been revisiting some classic books recently, just doing a reading list, and one of them that captures this extremely well that I recommend...
00:21:27.000 I think most people read in like middle school or something, but it's actually very relevant.
00:21:31.000 It's Brave New World.
00:21:32.000 So a lot of people, including Jordan Peterson, worry about 1984, sort of a totalitarian, a dystopia that represents a totalitarian state.
00:21:42.000 But Brave New World, there's no centralized government that's like dogmatic and controlling everything, surveilling everything.
00:21:50.000 They basically created this world where sex is easy, everyone is promiscuous, genetic engineering removes any kind of diversity, any kind of interesting dark, bad diversity that we would think of,
00:22:06.000 like the Hunter S. Thompsons and the Bukowskis, the weirdos of society.
00:22:11.000 And then he gives you drugs, Soma, that basically gives you pleasure whenever you want if you start feeling a little too shitty about your life.
00:22:21.000 And that's actually closer to us.
00:22:24.000 And it doesn't seem, if you, I mean, the way he writes about it, it sounds bad.
00:22:29.000 Like, we don't want that.
00:22:30.000 But then you start to ask a question, like, well, at which point would we realize it's bad?
00:22:36.000 Because it's constantly...
00:22:37.000 Obviously, we should do genetic engineering to remove any kind of maladies that we have, any kind of diseases.
00:22:44.000 It's like everything is an obvious step forward.
00:22:46.000 But then the place you end up at, just like with sex, is it good to have artificial images of as many as you want?
00:22:53.000 As much porn as you want?
00:22:55.000 As much sex as you want?
00:22:56.000 Is that good?
00:22:57.000 As much awesome stuff as you want?
00:22:59.000 Is that good?
00:23:00.000 Is that what human flourishing looks like?
00:23:01.000 Or do you want to have some constraints, some limitations, some finiteness of resources, some scarcity?
00:23:07.000 Maybe that's actually fundamental for human happiness.
00:23:11.000 Having too much of awesome stuff, maybe that destroys the possibility of real, meaningful, deep happiness.
00:23:16.000 It certainly does.
00:23:18.000 But I think the question really becomes, are we going to stay people?
00:23:22.000 Because I don't think we are.
00:23:23.000 I think we're moving in that general direction anyway.
00:23:26.000 I think that probably is why we have this, I mean, it's almost inevitable if you have this addiction to cell phone issue because everybody has that.
00:23:36.000 If you have a cell phone and you're on your social media apps during the day and you're on YouTube, you're probably addicted, whether you realize it or not.
00:23:43.000 And the number of hours that you put on those things is shocking.
00:23:48.000 When you actually look at your screen time, you're like, six hours?
00:23:51.000 I was on my phone for six hours?
00:23:53.000 What the fuck did I do?
00:23:54.000 And you'll try to rationalize it and justify it.
00:23:58.000 What that's doing to young people has got to be very strange.
00:24:02.000 And if that, along with all the contaminants that are affecting the way people develop, which are, you know, Dr. Shanna Swan from the book Countdown talks about this.
00:24:18.000 She talks about phthalates and plastics, and how you can trace back to, like, the 1950s, when they really started using a lot of plastics and petrochemical products that started getting into people's bodies in the form of phthalates.
00:24:33.000 Sperm counts started diminishing, penises and testicles and taints got smaller, and there were more miscarriages for women, lower fertility rates.
00:24:44.000 All that she believes is directly correlated with the data that they've done already on mammals.
00:24:52.000 When they do that to mammals in tests, the more phthalates they enter into their system, the more they have issues like this.
00:24:58.000 So we're becoming almost like less able to procreate naturally.
00:25:05.000 And if we get to a point where the human race's future, the only way we're going to be able to procreate is some sort of genetic engineering.
00:25:15.000 And some sort of artificial womb or some sort of a system that they develop that allows you to combine you and your partner's DNA and create a new child.
00:25:25.000 That seems to me like if you're going to do that and you started engineering out very specific aspects of people that are problematic – anger, greed, jealousy, lust – All these different things.
00:25:36.000 You would turn people into some sort of sexless thing that gets its pleasure by manipulating its neurochemistry through some electronics, through some something.
00:25:49.000 Maybe it's something you take so they can control it.
00:25:52.000 But that's not far off of the path of possibility.
00:25:55.000 If you really looked at where we're going now and if the fertility rates drop, if they really do, and I know people a lot smarter than me are actually worried about, like Elon's worried.
00:26:07.000 About the amount of children that people have.
00:26:11.000 There was a thing today on Italy.
00:26:14.000 I was reading this article on Italy where they were talking about how the population is very old and they're not having a lot of kids.
00:26:21.000 They're like, this is unsustainable.
00:26:22.000 Like, you can only do this for so long before you don't have anybody living there anymore.
00:26:27.000 And we don't think of that as being a possibility, but it doesn't take that long if nobody has kids for there to be no more people left.
00:26:34.000 Like, how many?
00:26:35.000 A hundred years?
00:26:36.000 Like, if nobody has kids.
00:26:37.000 A hundred years from now, there's no people.
00:26:38.000 It's real simple.
00:26:39.000 You have to make people, and how many do you have to make?
00:26:42.000 And can you make them?
00:26:43.000 Because you might want to start making them when you're 37, and you might go to a doctor, and the doctor's like, well, this is touch and go.
00:26:48.000 You're going to have to do in vitro fertilization, and then you go through all this shit, and you're taking shots, and you're fucking, you're timing everything.
00:26:56.000 And on top of that, if you're not...
00:26:58.000 By the way, I'm still getting funny audio every once in a while.
00:27:02.000 Oh, that's weird, because I don't...
00:27:03.000 It must be the headphones.
00:27:05.000 Yeah, I'll just unplug it and pull it back in.
00:27:08.000 Sorry.
00:27:10.000 How's that?
00:27:10.000 Check, check?
00:27:11.000 Check, check.
00:27:11.000 I don't know.
00:27:12.000 It's better.
00:27:13.000 It's usually better.
00:27:13.000 It's 98% better.
00:27:15.000 Oh, no.
00:27:16.000 It's still dropping out, dropping in.
00:27:18.000 Hmm.
00:27:19.000 Interesting.
00:27:20.000 Maybe we got a bad headphone.
00:27:21.000 Why don't you grab that headphone right there?
00:27:23.000 That's for you.
00:27:25.000 Yeah, maybe that headphone's gone dead.
00:27:28.000 These are old as fuck.
00:27:30.000 Probably need new ones.
00:27:31.000 No, but I mean, it's just like, how many people have thrown up on that and fucking...
00:27:35.000 How many people have thrown up on that?
00:27:37.000 How many people have been drunk as fuck and banged that off the table?
00:27:40.000 How many people have worn these headphones?
00:27:42.000 Like the legendary...
00:27:43.000 Oh, a lot of fucking people have worn those headphones.
00:27:47.000 It is weird.
00:27:47.000 No one even thinks about it.
00:27:49.000 You just kind of put them on.
00:27:51.000 But, you know, if it was like a toilet seat, you would be like, Jesus, naked butts were right here?
00:27:59.000 But it's ears.
00:28:00.000 It's like skin and face.
00:28:03.000 Interesting.
00:28:03.000 Still weird.
00:28:04.000 But it's fine.
00:28:05.000 It's not too bad.
00:28:06.000 There must be something wrong with that connection.
00:28:07.000 Yeah, there must be a connection thing.
00:28:09.000 Should we pause and try to figure it out?
00:28:11.000 We can do that for a second.
00:28:12.000 Okay, we'll pause.
00:28:13.000 We'll be right back, folks.
00:28:14.000 It seems to be working now?
00:28:15.000 Yeah, it seems to be working, I think.
00:28:16.000 So where were we?
00:28:18.000 Oh, on...
00:28:19.000 People becoming robots.
00:28:20.000 Not having sex anymore.
00:28:21.000 Yeah, people becoming...
00:28:23.000 On top of that, I do think, if we're not careful, I think there's exciting positive possibilities, but there's also negative possibilities of these AI systems, like ChatGPT, but later versions, forming deep, meaningful connections with human beings,
00:28:38.000 where most of your...
00:28:40.000 Friends.
00:28:41.000 No, most of your intimacy in terms of friendships and like a deep connection with an intelligent entity comes from AI systems.
00:28:50.000 Could you imagine if you're driving to work and you and the AI are just having a conversation shooting the shit and the AI is really funny and the AI is your buddy?
00:28:59.000 Like, Lex, what's going on, bro?
00:29:01.000 What are we doing?
00:29:02.000 Lex, what are we doing with this bullshit job?
00:29:04.000 Fuck this place.
00:29:05.000 Let's go home!
00:29:06.000 Let's have ice cream!
00:29:07.000 And you're laughing.
00:29:08.000 I got work to do.
00:29:09.000 I know.
00:29:10.000 I'm fucking around.
00:29:11.000 Imagine.
00:29:11.000 Yeah, what are you doing with that girlfriend?
00:29:13.000 Yeah, come on, Lex.
00:29:14.000 She keeps being mean to you, nagging you all the time.
00:29:17.000 You don't need her.
00:29:17.000 Coming off like a bitch, Lex.
00:29:19.000 You don't want to do that.
00:29:20.000 She's not going to respect you.
00:29:21.000 You're going to have to break up with her just so she respects you.
00:29:23.000 Why don't you murder her, Lex?
00:29:25.000 Lex!
00:29:26.000 There's a way to get away with it!
00:29:27.000 I'm just saying!
00:29:27.000 Joking around, buddy.
00:29:29.000 Joking around.
00:29:29.000 Next thing you know, it's talking you into a swamp with a fucking body bag.
00:29:34.000 Did you hear the story about that guy that googled all this stuff about what to do with your body?
00:29:37.000 Oh my god!
00:29:38.000 He googled till like 9:30 in the morning!
00:29:41.000 That sick fuck. Like, how dumb... I guess, look, we know this is a fact. We know some people are just really fucking dumb.
00:29:49.000 They really can't see the future. I want to know if that guy was on anything, too.
00:29:53.000 I want to know if he was on any kind of psych meds. Can you tell me the story again?
00:29:58.000 Oh, some guy killed his wife, man.
00:30:00.000 And they found his Google search.
00:30:03.000 It's horrific.
00:30:04.000 It's like, how to dismember a body?
00:30:06.000 How long does it take for a body to dissolve?
00:30:08.000 It's like, ugh.
00:30:09.000 Is it best to cut someone up or move them whole?
00:30:12.000 He just Googled the most horrific, and he did it for like the entire night into the morning.
00:30:17.000 Is there results for that in a Google search?
00:30:20.000 What happens if you put body parts in ammonia?
00:30:23.000 How to clean blood from a wooden floor?
00:30:25.000 Dismemberment and the best ways to dispose of a body.
00:30:28.000 Can identification be made on partial remains?
00:30:31.000 How long does DNA last?
00:30:32.000 Like, what the fuck, man?
00:30:34.000 How long before a body starts to smell?
00:30:36.000 Can you be charged with murder without a body?
00:30:39.000 This guy is fucking...
00:30:41.000 It's so sick.
00:30:42.000 So this dude just goes through Google all night long.
00:30:47.000 Trying to figure out how to get away with murder.
00:30:49.000 Well, he might actually get off on just asking the question, right?
00:30:52.000 No.
00:30:53.000 No, because then they found a bloody knife.
00:30:56.000 Yeah, like he went to the store.
00:30:57.000 Yeah, they found...
00:30:58.000 I'm not like pushing back.
00:30:59.000 I'm just saying he might also get off on...
00:31:01.000 I don't think he's getting off at all.
00:31:03.000 I don't think he has a chance of getting off.
00:31:05.000 I have a lot of questions about...
00:31:07.000 They found the knife.
00:31:08.000 ...human nature after...
00:31:09.000 Maybe I'm naive in this, but I watched the Dahmer documentary.
00:31:15.000 Yeah.
00:31:15.000 No, not the documentary.
00:31:16.000 The movie?
00:31:17.000 The movie.
00:31:17.000 Yeah.
00:31:18.000 And then also the documentaries.
00:31:19.000 It's like...
00:31:19.000 It gives you very different perspectives on what, like...
00:31:23.000 Are you a Dahmer sympathizer now, boy?
00:31:26.000 No.
00:31:26.000 No.
00:31:27.000 Okay.
00:31:28.000 No.
00:31:29.000 But, like, it makes you realize...
00:31:30.000 It seems like you're going that direction.
00:31:32.000 No, it makes you realize that some people's brain is broken.
00:31:35.000 Yes.
00:31:35.000 Yeah, I think so.
00:31:37.000 And then some people's brain might be a little bit broken, and they're still functioning members of society, but they might be extreme narcissists, they might be sociopaths, psychopaths, and you have to kind of understand that the world is full of, potentially, not full of, but has some charming psychopaths walking around.
00:31:56.000 100%.
00:31:56.000 Some of them are probably really successful in hedge funds and shit.
00:32:00.000 People that can just move money around.
00:32:03.000 People that are CEO of certain companies that might be making products that kill people.
00:32:08.000 Nobody, lots of Googling, murder case against Brian Walsh, maybe hard to prove, experts say.
00:32:13.000 But I thought they had a knife in the blood.
00:32:15.000 This was a couple days ago.
00:32:17.000 Friday, I guess.
00:32:18.000 Oh, okay.
00:32:20.000 I had read that they found a bloody knife.
00:32:24.000 The whole thing is so fucked.
00:32:27.000 But I want to know if he's on something.
00:32:29.000 I'd be really fascinated.
00:32:31.000 Because there's certain drugs that will, like, alleviate...
00:32:35.000 You won't worry about shit.
00:32:37.000 So maybe he's, like, not worrying about, like, Googling all this stuff.
00:32:41.000 Oh, it's all gonna work out.
00:32:43.000 You know, I'm gonna kill her, but it's all gonna work out.
00:32:45.000 And he's like, Googling.
00:32:46.000 Or he might be on speed.
00:32:48.000 You know?
00:32:49.000 A lot of people are on speed, man.
00:32:51.000 A lot of people are on Adderall.
00:32:53.000 It's fucking stunning.
00:32:54.000 It's stunning how many, like, hyped-up people we have out there are really, like, we have a speed culture.
00:33:01.000 It makes you very efficient.
00:33:02.000 You get shit done.
00:33:03.000 You've got plenty of energy.
00:33:05.000 And some people love it!
00:33:07.000 And, like, how much is that flavoring our culture?
00:33:09.000 Wouldn't it be nice to get rid of that, Lex?
00:33:11.000 Phase that out?
00:33:12.000 All drugs?
00:33:13.000 Yeah, in general.
00:33:14.000 All problems with the mind.
00:33:17.000 Mushrooms only.
00:33:18.000 Well, you could do it that way, but that takes a lot of work.
00:33:21.000 Or we could just genetically engineer it from the jump.
00:33:24.000 No more emotions.
00:33:25.000 No more emotions.
00:33:26.000 Because emotions, you know, life is suffering.
00:33:29.000 That's the... ask Nietzsche.
00:33:30.000 Ultimately, you're going to every good thing you have, eventually you're going to lose.
00:33:34.000 Every hello with the person you love is eventually going to be a goodbye.
00:33:38.000 Why say hello ever?
00:33:39.000 Why say hello ever again?
00:33:41.000 Also...
00:33:43.000 Why?
00:33:44.000 Why does it have to be like that?
00:33:45.000 Like we have this idea in our head that this way we live is, like, the ultimate way, because to us it provides emotions and because it creates dilemmas and solutions and conflict and resolution.
00:33:59.000 There's so much going on in our minds all the time when it comes to interacting with each other that we feel like it's imperative for existence.
00:34:07.000 But why?
00:34:07.000 It's just because it's the only way we've known.
00:34:10.000 Oh, you have to suffer, but why do you have to suffer in order to be happy?
00:34:14.000 Wouldn't it be better if you're just happy?
00:34:16.000 Do we really fucking need to suffer?
00:34:19.000 Couldn't that be engineered out?
00:34:20.000 Now, this is coming from a person who purposely suffers all the time so that I can stay happy.
00:34:24.000 And it does work.
00:34:26.000 But God, do I have to do it that way?
00:34:28.000 Well, there's an incredible computation machine we call evolution that has constructed human beings.
00:34:33.000 You want to mess with that?
00:34:34.000 You want to get a bunch of, you want to get a few software engineers from San Francisco to mess with the computation system that is evolution, that is Earth.
00:34:44.000 This giant computer that for billions of years spent a billion years on bacteria trying to figure shit out before it advanced and now it went through all of these incredible stages.
00:34:54.000 This entire ecosystem that we call life on Earth, probably planted here by aliens.
00:35:01.000 And recently, these monkeys started to get super clever, and now we're going to completely change everything.
00:35:09.000 Yes.
00:35:09.000 You know why?
00:35:10.000 Why?
00:35:11.000 Because that's a part of it.
00:35:12.000 Yeah.
00:35:13.000 That is a part of evolution, is the monkeys figuring out how to fuck with everything.
00:35:16.000 Well, that's probably why we haven't had any definitive evidence of aliens from out there, because the monkeys eventually start fucking with things.
00:35:24.000 And destroy themselves.
00:35:25.000 They turn themselves into starfish.
00:35:27.000 Whoops.
00:35:29.000 Yeah, you're super genius for like six months and then you become a fucking jellyfish.
00:35:33.000 I mean, that's a threat.
00:35:34.000 Maybe we're...
00:35:35.000 I mean, there's so many possible trajectories here that end up in what we would think of as boring.
00:35:41.000 I had a very interesting conversation with Eric Weinstein, and we're going to talk about it on the podcast from a physics standpoint.
00:35:51.000 He's very perplexed about the UFO thing.
00:35:54.000 And what's interesting is he's one of those guys, like a lot of very smart people, that were like, it's all horseshit.
00:36:01.000 It's all bullshit, but now he's come around to what the fuck is going on.
00:36:06.000 What's his take?
00:36:07.000 Well, he doesn't have a take necessarily, but he's looking at all the data and all the evidence.
00:36:11.000 We're gonna have like a whole long conversation about it, but essentially there's one of two possibilities.
00:36:17.000 Either this shit is coming from somewhere else or it's coming from here.
00:36:21.000 So either we or someone has some real legitimate groundbreaking technology Or someone's visiting us from somewhere else.
00:36:32.000 There's no ifs, ands, or buts.
00:36:33.000 It's one of the...
00:36:34.000 Because now there's enough data that show that these things are moving in a way that they can't understand.
00:36:39.000 There's video that they're moving in a way they can't understand.
00:36:42.000 They're not showing a heat signature from visible means of propulsion.
00:36:47.000 It's not like a rocket.
00:36:49.000 Whatever they're doing, they're doing something different.
00:36:51.000 And then Commander David Fravor, who you talked to, that Tic Tac experience. If they really did track something that went from above 50,000 feet above sea level to 50 feet in less than a second, what the fuck is that?
00:37:05.000 If that's real, we're assuming that all their calibration was on and all their equipment worked together, but it was multiple different visual sightings of this thing, too.
00:37:19.000 Different jets saw it.
00:37:20.000 Different people.
00:37:22.000 Uniform story.
00:37:23.000 Everybody's talked about how it just moved off at insane rates of speed.
00:37:26.000 And then there's all these other ones like Ryan Long and all these other people that he flew with that are seeing these similar behaviors from these things where they just disappear.
00:37:35.000 They move off at insane rates of speed.
00:37:37.000 So it's one of two things.
00:37:39.000 Either, he said, there's been some sort of parallel science, some science that's going on where nobody knew about it, and all the top physicists were completely unaware of this tech, and they were developing it independently in some fucking lab in the mountains for the government, or aliens.
00:37:56.000 Or someone else.
00:37:57.000 There's a bunch of other options.
00:37:58.000 And one thing is, I just talked to David Kipping.
00:38:02.000 I highly recommend his YouTube channel, Cool Worlds.
00:38:05.000 He's a legit, like, Huberman.
00:38:08.000 You sometimes get these, like, legit scientists who are also good communicators.
00:38:12.000 They'll step up.
00:38:13.000 Oh, nice.
00:38:13.000 So he's like the Huberman of astronomy.
00:38:18.000 A young guy.
00:38:19.000 You'll probably have him on eventually.
00:38:20.000 Sure.
00:38:21.000 I'll have him on.
00:38:22.000 He's brilliant, brilliant.
00:38:24.000 Definitely check out his channel.
00:38:25.000 Anyway, he's an astronomer, so he's deep in the astronomy, astrophysics community.
00:38:30.000 And he says, and you've said this before, too, that he tries really hard not to think about stuff he wants to be true.
00:38:39.000 Like, he wants to be calibrated properly.
00:38:41.000 Because with the UFO sightings, there is a part of you, I don't know why exactly, but you kind of want it to be true.
00:38:48.000 Not kinda.
00:38:49.000 Like, all the way.
00:38:51.000 98% of it to be true.
00:38:53.000 And there you have to be a little bit careful.
00:38:55.000 But yeah, definitely, to me, it feels like the scientific development that we're doing now with Starship, so SpaceX and Starship, with all the advancement in telescopes, we're just getting more and more and more data to where we're not going to have these shitty videos.
00:39:12.000 We're going to have high-resolution understanding.
00:39:15.000 And because it's becoming more okay to talk about aliens, I think the actual scientific community has a bigger humility about the topic to where they're expanding the window of their study to consider all kinds of physical phenomena,
00:39:35.000 all kinds of observation, all kinds of sources of data and signals and so on.
00:39:40.000 I would hope we would get definitive signals of alien life.
00:39:44.000 Yeah, definitive.
00:39:46.000 When you look at the capabilities of satellites today, like satellite imagery, how good are they?
00:39:55.000 And how many of them are up there that they could direct to a very specific area and get really good video or photographs?
00:40:05.000 I mean, it's incredible.
00:40:06.000 It's really, really good.
00:40:07.000 So why wouldn't they just call in?
00:40:10.000 You're talking about going from thousands to tens of thousands to potentially hundreds of thousands in a couple of decades.
00:40:18.000 But are they capable of imagery, the Starlink ones?
00:40:21.000 Yes, they're all capable of imagery, but that's not their purpose.
00:40:25.000 Right.
00:40:26.000 That's for internet.
00:40:27.000 But what about for visual?
00:40:30.000 When they have spy satellites or satellites that can look down and see everything that's going on in the city, how good are those?
00:40:36.000 They say they can read license plates from space.
00:40:38.000 Yeah, yeah.
00:40:39.000 Is that real?
00:40:39.000 I think that's real, yeah.
00:40:41.000 Okay.
00:40:41.000 But how prevalent are they?
00:40:43.000 That I don't know, because the capability, I think, is there to high-resolution image everything. But I don't know how much desire there is for that kind of application, because there's so much demand for other kinds of applications. Like low-resolution imaging for mapping purposes and so on, imaging for military purposes. There's applications, but that's a very specific kind of application. I just don't know. It's like
00:41:13.000 James Webb Telescope, right?
00:41:15.000 There's like huge battles going on on what that thing should do.
00:41:18.000 Because there's a lot of...
00:41:20.000 It's a constrained resource.
00:41:21.000 You have to battle what are the interesting questions, where should it look, what's the resolution of data, where in the sky should collect that data, how frequent, and so on.
00:41:30.000 In that same kind of way, there's probably battles over satellite resources of what should it be doing.
00:41:36.000 Sure.
00:41:36.000 Especially with intelligence agencies and stuff.
00:41:39.000 Especially because the intelligence agencies are probably resisting this UFO stuff.
00:41:43.000 If I was an evil dictator and I wanted to get my government to have control over the skies and to be able to see anything from anywhere at any time, And I wanted to, like, have mass surveillance drones in the sky above cities.
00:42:00.000 A good way would be aliens.
00:42:02.000 We have to capture these things on video, and there's only one way.
00:42:05.000 We're going to deploy these high-resolution video cameras in the sky and capture everything.
00:42:13.000 And there's sightings every day.
00:42:15.000 It's only a matter of time before we have a real high-resolution photo of something we can definitively prove is not ours.
00:42:22.000 It's not of this earth.
00:42:23.000 People would sign the fuck up.
00:42:25.000 You think there's that much excitement about aliens?
00:42:28.000 Why are aliens so interesting?
00:42:31.000 Because to me, philosophically and scientifically, it's a super interesting question.
00:42:36.000 Just even the question, are we alone?
00:42:38.000 That's really exciting.
00:42:39.000 But it's not...
00:42:40.000 Do you think people would vote to pay for that versus to pay for...
00:42:44.000 You don't want them voting for that.
00:42:46.000 Just do it.
00:42:47.000 Just do it.
00:42:48.000 Okay, okay.
00:42:49.000 Just tell them it's imperative.
00:42:51.000 Just like the fucking...
00:42:52.000 These bills that get passed.
00:42:54.000 We're not voting on those bills, right?
00:42:57.000 Representatives, they just do it.
00:42:58.000 They just do it.
00:42:59.000 They just put them in the sky.
00:43:00.000 The aliens are coming.
00:43:00.000 We've got to do something.
00:43:01.000 People would...
00:43:02.000 That would be the new climate change.
00:43:04.000 It's like aliens.
00:43:05.000 Oh, the aliens are coming!
00:43:06.000 And you're either with us against the aliens or you're with the aliens.
00:43:11.000 What are you, a fucking traitor?
00:43:12.000 You're going to give us up to the aliens, you piece of shit?
00:43:14.000 And so there's going to be some sort of ideological conflict on Earth whether or not we donate money to the defenses.
00:43:22.000 Like the Democrats who want to defend against the aliens, the Republicans are going to be like, hey, let's just hear them talk first.
00:43:28.000 And we're going to have a fucking giant dilemma here.
00:43:31.000 Yeah, I hope aliens are, if they're out there, I hope they're detectable by us humans and we can interact with them, probably not communicate with them.
00:43:41.000 But from my perspective, you have to be humble.
00:43:46.000 Advanced alien civilizations are probably so sophisticated that we dumb descendants of apes cannot possibly even detect them.
00:43:54.000 I have a feeling there's all sorts of ways that they could be.
00:43:58.000 And some of them could be undetectable.
00:44:01.000 They might be made of light.
00:44:03.000 Who knows?
00:44:04.000 But other ones are going to be just a little bit ahead of us.
00:44:06.000 There's an infinite number of them.
00:44:08.000 And there's going to be an infinite number of ones.
00:44:10.000 Sure they will.
00:44:11.000 If they're a thousand years ahead of us, you don't think they can get to us?
00:44:13.000 Yeah, space travel is really difficult.
00:44:15.000 Sure, it is.
00:44:16.000 But if they figure out some new technology within a thousand years, that's not outside the realm of possibility.
00:44:22.000 For sure.
00:44:23.000 But then they've figured out all other kinds of technologies that enable them to navigate complicated life forms that are unlike them and to be able to study them and to manipulate them and all that kind of stuff.
00:44:36.000 Without them knowing about it.
00:44:37.000 Without them knowing about it.
00:44:38.000 Why would you have them know about it?
00:44:40.000 Well, the idea could be that you want to kind of plant the seeds of this idea because it's so shocking to the psyche of these very fragile apes.
00:44:51.000 Yeah, you have to think about what we are, right?
00:44:58.000 We're real close to like what we were a million years ago.
00:45:05.000 We're real close to like very violent, hair-covered, barbaric animals.
00:45:11.000 And now we have thermonuclear weapons.
00:45:13.000 And now we have satellite imagery and cell phones and we're close to some new thing.
00:45:22.000 And I think if I was an alien, I would want to watch.
00:45:25.000 I would want to watch this very bizarre transition because, like, if you could study, look, think about all the things we go to study that are so boring.
00:45:33.000 Guys dedicate their whole lives to finding a new fern.
00:45:35.000 You know, and what are we?
00:45:37.000 You know, we're the most fascinating fucking thing in the known universe by far.
00:45:42.000 If we didn't know about people, if we were some logical creature from somewhere else, and we found people.
00:45:48.000 And we would be like, holy shit!
00:45:51.000 Wait till you see these fucking guys.
00:45:53.000 They have a popularity contest, see who controls the weapons.
00:45:57.000 They're all like obviously paid off by these corporations and special interest groups.
00:46:02.000 And everybody's like, I don't get it!
00:46:04.000 These politicians, they make hundreds of millions of dollars on a job that pays $100,000 a year.
00:46:09.000 And we're like, what?
00:46:11.000 What the fuck is going on?
00:46:12.000 Well, if they're observing us, do you think humans stand out that much from the rest of life on Earth?
00:46:17.000 Because it could be the same kind of life force that you just described, some basic stuff, some basic dynamics of interactions between species that could be equally as fascinating as the interaction between ants.
00:46:31.000 Well, I think those are fascinating, too.
00:46:33.000 I don't think anybody would think that ants aren't fascinating.
00:46:35.000 Ants are fascinating to us.
00:46:37.000 I'm sure ants would be fascinating to someone from another planet that doesn't know what ants are.
00:46:41.000 But ants can't nuke the whole fucking planet a hundred times over and be pointing weapons at each other.
00:46:47.000 We have a weird ability.
00:46:49.000 To change the surface of the earth.
00:46:51.000 We've created these structures that rise hundreds and hundreds of feet into the sky.
00:46:57.000 They're all made out of glass.
00:46:59.000 Like, we're wild.
00:46:59.000 We're so different than any other animal.
00:47:02.000 I mean, yeah, there's a lot of fascinating other animals.
00:47:05.000 Lions are fascinating.
00:47:06.000 Zebras are fascinating.
00:47:07.000 Everything's fascinating.
00:47:08.000 But not like us.
00:47:10.000 No.
00:47:11.000 If you came here from another planet, the first thing you'd go is like, these crazy talking monkeys are out of control.
00:47:17.000 And you would just start rattling off what they do.
00:47:19.000 You'd talk about Real Housewives of Beverly Hills.
00:47:21.000 You would talk about rock stars.
00:47:23.000 You would talk about the internet.
00:47:24.000 You would talk about TikTok, about phone addictions.
00:47:28.000 People would think it's fascinating.
00:47:30.000 Or, just like ChatGPT and GPT-1, 2, and 3, they see that as a trivial consequence of evolution, that you just increase the computational power of the brain, you're going to start getting these kinds of interactions, because they know what happens in the next thousands of years.
00:47:45.000 They understand the general trajectory.
00:47:47.000 It's going to be...
00:47:48.000 We don't know that trajectory.
00:47:50.000 It could be AI... AI and then there's stages in the development of AI and the kind of system it creates.
00:47:57.000 Maybe it'll be one collective intelligence that encompasses the whole world where it's no longer individual entities.
00:48:05.000 It's one intelligence that's trying to solve nuclear fusion and achieve a Type One Kardashev-scale civilization that's unable to become a multi-planetary species.
00:48:17.000 They know this whole development is trivial to them.
00:48:19.000 They're going to yawn.
00:48:21.000 Or maybe they know that this is the stage where it's inevitable that these creatures destroy themselves.
00:48:27.000 Because in order to achieve this level of intelligence, there has to be a fundamental desire for conflict.
00:48:35.000 And the better the weapons get, the more the conflict will enable them to destroy themselves.
00:48:40.000 If not through nuclear weapons, then through AI, through genetic engineering, through all kinds of stuff.
00:48:45.000 Man, maybe that's where aliens come in.
00:48:48.000 And maybe what aliens are is like a caretaker of this process.
00:48:53.000 Yeah.
00:48:53.000 Which is why, you know, one of the things about UFO folklore is that when they dropped Fat Man and Little Boy, when they dropped those bombs on Hiroshima and Nagasaki, there was like a pretty big uptick in UFO sightings.
00:49:06.000 Yeah.
00:49:06.000 And we're talking about like, why do people want it?
00:49:09.000 Why do they want the aliens to be there?
00:49:11.000 I think because we realize...
00:49:14.000 How many questions we have.
00:49:16.000 We realize like how little we really know.
00:49:19.000 We know so much but so little and we don't have much time.
00:49:22.000 We live for a hundred years if everything goes great.
00:49:26.000 We don't know what's right in terms of nutrition.
00:49:31.000 Someone will tell you this is terrible for you.
00:49:33.000 Another one will tell you that's essential for human development, and you're like, what?
00:49:38.000 We don't know what's the right way to educate people.
00:49:41.000 You hear that our school systems are great.
00:49:43.000 They just need more funding.
00:49:44.000 And you hear, no, they were designed to make factory workers out of rural people.
00:49:49.000 They were designed to take, like, people that had...
00:49:52.000 They were wild folks, and make them sit in a fucking chair and do everything by factory bells every day.
00:49:59.000 Well, both those things are true.
00:50:00.000 Right?
00:50:01.000 At the same time...
00:50:04.000 They might look at us and see this is amazing.
00:50:06.000 We live a very short amount of time.
00:50:09.000 We're overdramatic and emotional.
00:50:11.000 We fall in love and then there's heartbreak and you lose the people you love.
00:50:15.000 You go to war with each other, and through that process of war form some of the strongest possible bonds that any two entities can, with the people you fight alongside.
00:50:23.000 And then somehow you form these different hierarchies where people hunger for power and destroy other human beings through that desire for power, for greed, and all that kind of stuff.
00:50:35.000 Oh, no doubt.
00:50:35.000 And then all of it, the individual life itself, the human condition, is deeply meaningful because of all those constraints, because of all the uncertainty in the mystery.
00:50:45.000 They might be jealous because they figured all their shit out and they're just...
00:50:49.000 Maybe that's, we're at this stage where, because we haven't figured out most things, life is beautiful.
00:50:58.000 Like, life can be beautiful in this way that they'll know they no longer can be.
00:51:03.000 What I was saying, though, is that's why we are looking to them.
00:51:06.000 Because we have all these questions about what we're doing.
00:51:09.000 That's why we're so fascinated by the idea of an alien.
00:51:13.000 They might be looking at us the way we look at Western movies.
00:51:15.000 We romanticize the bullshit aspect of taking a fucking wagon with stupid wooden wheels and wobbling your way across the mountainside while the Indians shoot arrows at you.
00:51:25.000 It's a terrible way to live.
00:51:27.000 What if they know that asking questions and not knowing the answers is way more meaningful and full of the possibility of happiness than having all the answers?
00:51:41.000 It's totally possible.
00:51:43.000 Or this idea of what is and isn't meaningful is trivial.
00:51:47.000 And it's only a consequence of our monkey brains trying to grasp for reason.
00:51:52.000 And that once we've transcended that and moved into this next stage of evolution, which we would hope they are, we would realize how foolish these primal notions we had were.
00:52:03.000 The purpose they served was just to get us to the dance.
00:52:07.000 Just to figure out the computers, figure out all the technology, and then let us transcend to the next stage of existence, which removes all of our primal... all of our different emotions and all of our different problematic forms of expression,
00:52:25.000 violence and greed and lust and deception and all those things.
00:52:30.000 Just eliminate all of it.
00:52:32.000 Everything that's a problem.
00:52:33.000 Brave New World.
00:52:35.000 It's both, right?
00:52:36.000 Because someone's going to be in control of it.
00:52:38.000 So that's the scary thing.
00:52:40.000 It's like part 1984 with the WEF and part Brave New World with everything else.
00:52:44.000 It's like we're definitely living in a time where certain people with a lot of resources are trying to figure out how to control people.
00:52:52.000 That's a fact.
00:52:54.000 They always have.
00:52:55.000 It's a natural part of what human beings do.
00:52:57.000 And they used to do it with kings and armies, and if they could do it digitally, they'll do it that way.
00:53:02.000 They love to tell people what they can and can't do, and they love to control people and extract resources at an extraordinary amount or extraordinary rate for almost nothing.
00:53:12.000 They love to do that.
00:53:14.000 Can I just steal money?
00:53:15.000 Can I just tell people what to do and steal their money?
00:53:17.000 Yeah, you can.
00:53:18.000 You gotta be at the top of the food chain in one of those crazy organizations.
00:53:22.000 Do you think it really is possible to move beyond this stage?
00:53:28.000 Yes.
00:53:28.000 Or is it possible this is the optimal?
00:53:30.000 This greed, the possibility of other people being able to control you because of greed or desire for power, the weird relationship we have with sex of always chasing it and not getting it and then getting it and then that weird dynamic.
00:53:45.000 Then the pleasure you get from a good steak or food, all of that.
00:53:50.000 Just the pleasures or whatever the hell music is.
00:53:53.000 Yeah, whatever the hell music is is the best question, right?
00:53:55.000 Because all the other ones, it seems like those are just human rewards, right?
00:54:00.000 Like the reason why it feels good to have sex is because if you have sex, then your genes carry on.
00:54:04.000 So it gives you a reward.
00:54:05.000 It's really a nice biological trick.
00:54:08.000 So is food.
00:54:09.000 Tastes great, good for you.
00:54:10.000 You'll stay alive.
00:54:11.000 You need nourishment for your body.
00:54:13.000 Smell it.
00:54:14.000 That's not why you love it.
00:54:16.000 Right.
00:54:17.000 But you love it because of these human rewards.
00:54:19.000 No, no, no, no.
00:54:19.000 That's a simplistic explanation.
00:54:21.000 Sure it is.
00:54:22.000 That it's not explaining the subjective reality of what it feels like.
00:54:26.000 Of course it's not.
00:54:27.000 Fasting a lot.
00:54:27.000 But it's also true.
00:54:28.000 After fasting, eating a good steak.
00:54:29.000 Oh, yeah.
00:54:30.000 No, no.
00:54:30.000 You can't explain that with biology.
00:54:32.000 Being in the cold and taking a hot shower.
00:54:34.000 Okay.
00:54:34.000 When you're in the cold for days camping and you take a hot shower, it's the greatest feeling you'll ever have in your life.
00:54:38.000 Yeah, you can do an evolutionary biology explanation, and you can reduce every beautiful human experience to a biological explanation, but I think you actually lose a lot of the things that aliens are jealous of.
00:54:55.000 I don't think aliens are jealous.
00:54:56.000 I think they got rid of that part.
00:54:58.000 That's the point.
00:54:59.000 Jealousy is another beautiful aspect of the human condition.
00:55:02.000 It's beautiful for us.
00:55:04.000 I'm not saying it's not beautiful.
00:55:06.000 It's beautiful for us.
00:55:08.000 It does create things that we currently enjoy.
00:55:11.000 We enjoy art.
00:55:13.000 We enjoy expression.
00:55:14.000 We enjoy a painful song.
00:55:16.000 Like when Janis Joplin sings Piece of My Heart, you can hear the pain in her voice.
00:55:23.000 You can hear it.
00:55:24.000 I mean, you can relate to that when you're listening to it, that incredible voice she had.
00:55:32.000 That's that woman's essence coming out and the sound that she made with her mouth.
00:55:37.000 For us, it's amazing.
00:55:39.000 Speaking of her mouth, she broke the heart of Leonard Cohen after she gave him head.
00:55:44.000 She broke his heart?
00:55:45.000 Yeah.
00:55:45.000 How'd she break his heart?
00:55:47.000 Well, he fell in love with her.
00:55:51.000 She didn't want to be with him.
00:55:53.000 Oh, poor Leonard.
00:55:56.000 Cry me a river.
00:55:57.000 Janis Joplin blew ya.
00:55:58.000 Move on.
00:55:59.000 He got a great story.
00:56:00.000 He wrote a good song about it, too.
00:56:01.000 Did he?
00:56:02.000 Chelsea Hotel No. 2, I think it's called.
00:56:05.000 He gave me head something.
00:56:07.000 I forgot how the...
00:56:10.000 And I think she said, which was not very nice, that he was a bad lay.
00:56:15.000 Oh.
00:56:16.000 Well, maybe it was.
00:56:18.000 Yeah, well, you know.
00:56:20.000 Well, someone made a good song about head. I remember you well in the Chelsea Hotel.
00:56:25.000 You were talking so brave and so sweet.
00:56:27.000 Giving me head on the unmade bed while the limousines wait in the street.
00:56:31.000 First of all, someone's a bit of a chatty Cathy.
00:56:35.000 Okay?
00:56:35.000 How about keep that fucking story to yourself, bro?
00:56:38.000 No, but he didn't say it's Janis Joplin.
00:56:39.000 Yeah, but everybody's gonna know.
00:56:41.000 You don't need to shame Janis Joplin.
00:56:43.000 She's a nice lady.
00:56:44.000 Why is it shameful?
00:56:45.000 Are you slut-shaming?
00:56:46.000 Sucking dicks in some fucking weird hotel?
00:56:48.000 Sucking dicks.
00:56:49.000 It's one dick, and it's romantic.
00:56:50.000 And it's not some hotel.
00:56:52.000 It's in New York.
00:56:53.000 What are you talking about?
00:56:53.000 Oh, it's better.
00:56:54.000 It's better because it's in New York.
00:56:55.000 Don't suck dick in Detroit.
00:56:57.000 Hold it.
00:56:57.000 Just keep it together until you land on the East Coast.
00:57:00.000 Alright, we've been watching too much John Wick, and I don't know if Scent of a Woman will be...
00:57:04.000 Which also takes place in New York.
00:57:05.000 Not even Scent of a Woman.
00:57:07.000 What does he drive?
00:57:08.000 Like a Lamborghini or Ferrari in that?
00:57:10.000 Which movie?
00:57:11.000 Scent of a Woman.
00:57:12.000 Is the Devil driving?
00:57:13.000 Is he the Devil?
00:57:14.000 No, he's not the Devil.
00:57:15.000 You're thinking of like...
00:57:16.000 I think of Al Pacino!
00:57:20.000 Yelling!
00:57:21.000 Gotta yell!
00:57:23.000 That's actually, I think, the first movie he did the hoo-ah thing in.
00:57:27.000 Hoo-ah!
00:57:28.000 Hoo-ah!
00:57:29.000 Yeah, Scent of a Woman.
00:57:30.000 Al Pacino absolutely deserves his Oscar.
00:57:34.000 You love that movie.
00:57:35.000 Well, no, I love a lot of movies.
00:57:36.000 I just love talking shit because you said that movie sucks.
00:57:39.000 Did I say it sucks?
00:57:40.000 You said like, meh, because I think I compared it to...
00:57:42.000 You said it was better than something, I think.
00:57:44.000 Than John Wick.
00:57:45.000 Well, I want you to know that if you compare movies with me, I will just say whatever I think would be funny to say.
00:57:52.000 I don't really...
00:57:53.000 I mean, if you want to break it down, I'd have to watch that movie and go over it, whether or not I enjoyed it or not.
00:58:00.000 Okay.
00:58:01.000 I don't understand humor.
00:58:02.000 It's called talking shit, sir.
00:58:03.000 Oh, he's got a Ferrari.
00:58:04.000 That's a nice one.
00:58:05.000 That's like a Magnum PI Ferrari.
00:58:06.000 He's blind driving it.
00:58:07.000 Oh, good idea.
00:58:09.000 Shut this fucking thing off.
00:58:11.000 Get the fuck out of here.
00:58:12.000 The aliens would be like, whoa, you're going to get your thrills out of driving blind.
00:58:16.000 Tell him he's driving.
00:58:17.000 Sit him down in a chair, give him a fucking wheel.
00:58:22.000 Dude, you're driving so good.
00:58:23.000 It's incredible.
00:58:24.000 You're the best driver ever.
00:58:28.000 I get what you're saying.
00:58:30.000 I agree with you.
00:58:30.000 About the movie, about aliens?
00:58:32.000 About aliens.
00:58:32.000 I get what you're saying about what's beautiful about being a person.
00:58:35.000 It's beautiful to us.
00:58:37.000 But I think this is, if I had to guess, and this is just pure speculation, I think this is a stage of evolution that's very crazy.
00:58:46.000 It's very wild.
00:58:48.000 It's very chaotic.
00:58:49.000 But it's this weird stage in the combining of a kind of intelligence that has emerged out of human creativity
00:59:01.000 and become much more powerful than humans and has the ability to control humans and has the ability to make its own physical objects.
00:59:10.000 Has the ability to improve upon itself.
00:59:13.000 And it won't think anything of keeping us around.
00:59:18.000 We might be, like what Marshall McLuhan said, he said, we're the sex organs of the machine world.
00:59:24.000 Which is one of my favorite quotes ever.
00:59:26.000 It's such a good quote.
00:59:27.000 Except the sex organs, you might want to keep those around for a while.
00:59:31.000 And it's possible that they don't want to keep us around.
00:59:35.000 Keep the zebra in the zoo.
00:59:37.000 Yeah.
00:59:37.000 Because you can have an artificial zebra.
00:59:40.000 Keep them alive.
00:59:41.000 Gotta keep them in the zoo.
00:59:42.000 I mean, zoos are the most horrific thing ever for a fucking animal.
00:59:46.000 I used to do a joke about this: the only animal that doesn't have a bad time in the zoo is giraffes.
00:59:52.000 They're so chill because no lions are eating them.
00:59:54.000 It's a beautiful day for them.
00:59:56.000 But everybody else is like, get me the fuck out of here.
00:59:58.000 I don't want to be in this shitty little thing where people stare at you.
01:00:01.000 Yeah, well, maybe Earth is a kind of zoo, and then we're in it, and then we're being observed.
01:00:07.000 And maybe all the suffering is a kind...
01:00:11.000 There's probably activist aliens, they're saying, why keep the humans, these conscious beings that are capable of so much suffering, why allow them to continue suffering?
01:00:21.000 I mean, that's the question, the religious question people ask.
01:00:23.000 Why does God allow suffering?
01:00:25.000 Right.
01:00:25.000 Why is there evil?
01:00:27.000 Why is there injustice?
01:00:28.000 I think all of these questions are really good questions, but we look at it through the eye of culture.
01:00:35.000 We look at it through the eye of what's meaningful for us, what life means to us.
01:00:40.000 But if you could look at it almost like a computation, if you could step away... it's impossible for us to do it, but if you just had to pretend, if you could step away and look at it like this thing is moving in a certain way,
01:00:56.000 like what is it doing?
01:00:57.000 Well, it's making better stuff.
01:00:59.000 That's all it does.
01:01:00.000 All it does is make better stuff.
01:01:01.000 It has a lot of things in there like romance and sound and stories and the hero's journey, but what is it really doing?
01:01:10.000 Input is energy.
01:01:11.000 It's making stuff.
01:01:12.000 Output is better stuff.
01:01:14.000 It's making better stuff.
01:01:14.000 But it needs energy, so it needs the input.
01:01:16.000 And it's recently addicted to stuff.
01:01:19.000 Recently addicted to electronic stuff where you have to carry around this thing with you.
01:01:23.000 So this thing has got this parasitic relationship with you.
01:01:26.000 And you need a new one every year.
01:01:28.000 You need a better one because the better one came out.
01:01:30.000 Oh, what's the better one do?
01:01:31.000 The better one has a better camera.
01:01:32.000 Okay.
01:01:33.000 So this keeping up with the Joneses, which seems to be a part of like just natural human behavior patterns, like people always want to keep up with their neighbor, right?
01:01:43.000 Well, the thing that fuels this technological innovation is all materialism.
01:01:50.000 Materialism fuels it because you have to get the latest, greatest stuff.
01:01:54.000 Like, you know, you're gonna have a laptop from four or five years ago, you're not gonna notice.
01:01:57.000 Yeah.
01:01:58.000 You know, you're not gonna fucking notice.
01:01:59.000 You're gonna have a phone from, like, I have one of my iPhones is an iPhone 11. I don't notice.
01:02:04.000 I make calls, take pictures, looks great.
01:02:07.000 Get on the, answer emails, looks great.
01:02:09.000 It's a fucking, I just keep it around just to see how long it takes before I get mad at it.
01:02:13.000 I don't notice a difference.
01:02:16.000 It is an open question whether that's a permanent state of affairs at this point, this kind of capitalist materialistic pursuits, or that's a temporary stage.
01:02:25.000 That's what Karl Marx thought, that capitalism is a temporary state.
01:02:28.000 The ultimate place to be is perfect communism, pure communism.
01:02:34.000 Well, I don't think that works with humans because I think part of what makes us achieve and do these things and even make life better and safer for everybody is we're constantly looking to do better than the people before because you get rewarded for doing better.
01:02:48.000 Yeah, competition.
01:02:49.000 It's very important.
01:02:51.000 It's everything for what this thing is doing.
01:02:56.000 Competition is everything for it.
01:02:57.000 If you don't have any competition at all, no competition, and everyone just has money and we all just sit around and wait.
01:03:04.000 And there's no need for innovation because you can't get ahead.
01:03:07.000 There's no need for creating a new Apple because you don't make any money doing that.
01:03:12.000 You're not going to do it.
01:03:13.000 Those folks that are working at Google right now that are doing 16-hour days, or people that are working to try to fix Twitter that are working constantly, the people that are working at SpaceX, if they were making no money, they wouldn't do that.
01:03:26.000 If they didn't have to do it, they wouldn't do that.
01:03:28.000 I disagree with that.
01:03:30.000 I'm a big proponent of capitalism, but I think the motivation of a lot of those engineers is not money.
01:03:37.000 But to fund it?
01:03:39.000 But yes, there's a bunch of stuff that's an output of capitalism that enables those engineers to do incredible work.
01:03:47.000 So yes, to fund it, that whole mechanism.
01:03:50.000 Also, there's something about centralized control, which is required by at least socialism, that creates bureaucracy that slows down entrepreneurship and innovation.
01:04:01.000 Yeah, for sure.
01:04:02.000 But I just, I don't know if the people, even like billionaires, that seems to be like a bad word.
01:04:09.000 I think people think that they're in the tech sector motivated by money.
01:04:14.000 I just, I know a lot of them.
01:04:15.000 I don't see it.
01:04:16.000 Sometimes they fall in love with the things that money will bring later on.
01:04:22.000 They enjoy whatever benefits of that, cars, houses, and so on.
01:04:26.000 But they're not motivated by it.
01:04:29.000 But if you're going to fill Google, how many employees work at Google?
01:04:34.000 You got thousands.
01:04:35.000 Yeah, tens of thousands.
01:04:36.000 They're not going to put in those 16-hour days unless they have to.
01:04:39.000 Well, I can push back on that.
01:04:41.000 Don't you think they're doing that because they have a great opportunity to make more money and to advance their career, and while they're 27 years old and they're doing these 16-hour days, they're hoping for some sort of a return on this investment of time and effort.
01:04:56.000 In the modern state of Google and so on, I think the people that are doing the most breakthrough work are like the 10x contributors.
01:05:05.000 That's the other secret, I think, of those companies.
01:05:07.000 Some people are just kind of doing a job and some people are really pushing the limits.
01:05:12.000 So some people are working and they're facilitating all the stuff that needs to go on in the background and keep the company running?
01:05:20.000 The bigger the company gets, and you see this with Elon firing a large percentage of the people at Twitter, the more people just kind of get complacent and comfortable and so on.
01:05:29.000 At large companies, especially if there's a profit coming in, it's like, what exactly is the motivation for you?
01:05:36.000 Because you feel like a cog in a wheel.
01:05:38.000 Yeah.
01:05:38.000 And the people that really step up, usually they're going to be in smaller companies, like in startups, to where it's clear where my ambitious contribution can actually bring impact to the world, but none of that is money.
01:05:53.000 If there was no way to make money doing that, you don't think some of those people would drop off?
01:05:59.000 Yeah.
01:06:00.000 I think money is a component, of course.
01:06:02.000 It's a component, right?
01:06:03.000 But the fuel...
01:06:05.000 And I don't know if there's something special about tech, about the brain of the people that do technology.
01:06:11.000 It's almost like playing games.
01:06:13.000 They would play chess no matter what.
01:06:14.000 Yeah, they're the tinkerers.
01:06:16.000 And it just so happens that tech brings billions of dollars.
01:06:19.000 But if you look at Olympics, right?
01:06:21.000 The Olympic Games, chess is a beautiful example.
01:06:23.000 Nobody makes money playing chess, but there's a huge community of chess players that dedicate every ounce of their being to improving at chess.
01:06:34.000 And it's a really good example because it's a similar kind of brain that is attracted to tech.
01:06:40.000 There's limitations to that kind of brain because it often lacks empathy and basic desire to understand human nature and human beings and so on.
01:06:48.000 They just want to be tinkering and building puzzles.
01:06:50.000 I love that one dude who cheats.
01:06:53.000 And he's kind of like openly cheated, but he's also really good at chess.
01:06:57.000 That 19-year-old kid, he might have cheated when he beat...
01:07:01.000 Who did he beat?
01:07:03.000 Magnus Carlsen.
01:07:05.000 Yeah, Hans.
01:07:05.000 Yeah, that guy.
01:07:06.000 That's a fascinating story.
01:07:08.000 Because you would think that someone who cheats sucks.
01:07:12.000 But no, he's actually really good at chess, and also he cheated.
01:07:15.000 Like a gang of times.
01:07:16.000 And his mentor cheated too.
01:07:18.000 Right.
01:07:18.000 And he cheated to try to get a higher rating online, and he openly admits it, and you're like, Jesus, what did he do with this?
01:07:23.000 When he was younger.
01:07:24.000 Yeah.
01:07:24.000 But not that young.
01:07:27.000 No, young, 13 or whatever.
01:07:28.000 No, no, no, no, no, no.
01:07:29.000 I think the most recent one was a little later than they initially thought.
01:07:34.000 Sure.
01:07:35.000 The evidence there is complicated, but it's similar to steroids.
01:07:40.000 People that take steroids when they're competing in sports, they're already the elite.
01:07:45.000 Right, but here's why it's not similar.
01:07:47.000 Because if you're cheating in a game, you're using something, anal beads, whatever it is, that's allowing you to make better moves.
01:07:58.000 That is so much different than if everyone's doing steroids.
01:08:02.000 Because if you're doing steroids because everyone's doing steroids, everyone is cheating, so there's no cheating.
01:08:09.000 But if one person is just using their brain and the other person is using some sort of a calculation and getting some sort of a signal, we don't even know if that was real because it was never proven, right?
01:08:18.000 He might have just beaten them.
01:08:20.000 Like, Magnus might have gotten off to a bad start.
01:08:22.000 That's the thing.
01:08:22.000 This guy can actually play chess.
01:08:25.000 Which is kind of crazy.
01:08:25.000 And he's chaotic, creative, and so on, so it's hard to know.
01:08:28.000 He's not a standard chess player.
01:08:30.000 Fascinating.
01:08:31.000 Yeah.
01:08:31.000 It's fascinating when there's someone like that that doesn't fit into what you think.
01:08:35.000 There was a guy who was beating a lot of people online, and some people were saying, hey, this guy's cheating.
01:08:42.000 I want to play him in person.
01:08:44.000 And then they had a match, and they set it up and played him in person, and he was terrible.
01:08:47.000 He made all these mistakes.
01:08:48.000 And everyone's like, I knew it.
01:08:49.000 You're cheating.
01:08:50.000 Like, that's the standard.
01:08:52.000 We're used to that.
01:08:53.000 But in this case, no.
01:08:54.000 This guy's a wizard of chess.
01:08:56.000 Like a straight-up killer.
01:08:57.000 And also he cheats.
01:08:59.000 Or he cheated.
01:09:00.000 To me, it's fascinating because whether it's anal beads or anything else, it's like a cyborg expanding your capabilities with technology.
01:09:10.000 That's cheating.
01:09:11.000 It's not even expanding your capabilities, right?
01:09:12.000 No, it's 100% cheating.
01:09:13.000 But for example, if I had something, whether it's up my ass or in my ear right now, and it was using ChatGPT, like you asked me for an explanation of the war between Russia and Ukraine, and I would just tune in to the ChatGPT explanation and just give you that explanation,
01:09:31.000 right?
01:09:31.000 I think that's really interesting to me, how to expand human capabilities.
01:09:37.000 Because you have to understand, there's a lot of dangerous trajectories this could possibly take.
01:09:42.000 Like, I did the chess-playing thing, not with anal beads, but for people who are curious, I discovered this.
01:09:49.000 This is fascinating.
01:09:51.000 There's quite a lot of anal beads and butt plugs and sex toys that are Bluetooth connected.
01:09:55.000 It's very... and they have Python APIs.
01:09:57.000 So if you're curious, you and your girlfriend.
01:09:59.000 There's quite a lot of them?
01:10:00.000 Yeah.
01:10:01.000 Yeah, so you can program.
01:10:03.000 There's a...
01:10:03.000 I think it's actually called Buttplug on GitHub.
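A minimal sketch of what driving one of these devices from Python might look like, based on the community buttplug-py client for the Buttplug project Lex is referring to. It assumes a local Intiface/Buttplug server listening on its usual default websocket port, and exact method names may differ between client versions, so treat it as an illustration rather than a recipe:

    import asyncio
    from buttplug import Client, WebsocketConnector, ProtocolSpec

    async def main():
        # Connect to a locally running Intiface/Buttplug server
        # (the address below is the server's usual default).
        client = Client("demo-client", ProtocolSpec.v3)
        connector = WebsocketConnector("ws://127.0.0.1:12345", logger=client.logger)
        await client.connect(connector)

        # Scan for nearby Bluetooth devices for a few seconds.
        await client.start_scanning()
        await asyncio.sleep(5)
        await client.stop_scanning()

        if client.devices:
            device = client.devices[0]  # first discovered device
            # Buzz the first actuator at half intensity, then turn it off.
            await device.actuators[0].command(0.5)
            await asyncio.sleep(2)
            await device.actuators[0].command(0.0)

        await client.disconnect()

    asyncio.run(main())

That same command() call, driven by an audio beat detector instead of a timer, is roughly all that "moving to music" would take.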
01:10:07.000 Can you have them, like, move to music?
01:10:12.000 A tip?
01:10:12.000 Wait, I'm sorry, what?
01:10:14.000 It's a tip's music?
01:10:15.000 What do you even move to music?
01:10:16.000 So there's like...
01:10:17.000 Yeah, 100%.
01:10:19.000 I don't know if your ass or vagina could feel that.
01:10:25.000 I don't know.
01:10:26.000 I have not investigated any of this, but clearly a lot of people are.
01:10:29.000 Imagine if it syncs to a song.
01:10:31.000 You're masturbating to In-A-Gadda-Da-Vida.
01:10:33.000 That's pretty easy, actually, to do.
01:10:37.000 Yeah.
01:10:38.000 This is something you would 100% do.
01:10:41.000 But I don't know how much interest there is.
01:10:44.000 So you could somehow or another use that to send a signal that would tell you knight to whatever, rook, four, whatever.
01:10:50.000 I didn't use that.
01:10:52.000 I just used... there's a bunch of devices that can vibrate.
01:10:54.000 They're just like the size of a quarter.
01:11:00.000 And so I played with that.
01:11:02.000 I don't even know how they theorize.
01:11:07.000 If someone's playing chess and they have an anal bead that gives them signals, how could it even tell you how to move your pieces around?
01:11:13.000 What kind of a bizarre code would you have to...
01:11:16.000 Well, I'm glad you asked, Joe.
01:11:20.000 There's an answer to this.
01:11:22.000 No, so for a beginner like me, so just like a mediocre player like me, you would use a lot of information like Morse code.
01:11:30.000 You would say, take this piece, so it's the position of that piece, and move it to here.
01:11:34.000 That's a lot of vibration.
01:11:36.000 For a Grandmaster-level player, all you need is a very low-resolution signal, even just the information that there exists a move here that's not standard, that's going to be very strong.
01:11:54.000 So that sends a Grandmaster signal to think about this position.
01:11:59.000 Like, there's obvious moves and there's non-obvious moves.
01:12:02.000 I'm just giving you examples of how a Grandmaster needs much less signal.
01:12:07.000 I see what you're saying.
01:12:08.000 So for a more normal player...
01:12:10.000 It would need to be buzzing like crazy.
01:12:12.000 But so with Morse code, there's a lot of different ways to compress.
01:12:16.000 Like if you want to get good at this, it's actually, I forget how many bits of data are needed, but it's very little.
01:12:21.000 But if the easy one is Morse code to just send you the position of the piece.
01:12:27.000 The interesting thing that I have not tested, and the few people in the audience that want to test this can, is that a lot of the vibrating devices have different settings, 0 to 20. I wonder how sensitive you are, to be able to tell the difference between the settings.
01:12:43.000 You can kind of, like, hover over the piece, like warmer, warmer, colder, colder.
01:12:48.000 So you can have information.
01:12:49.000 I don't know if you can get information from the different intensities, or does it have to be binary 0, 1?
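A toy sketch of the unary, Morse-style scheme described above, in plain Python. The pulse() function is a hypothetical stand-in for a real actuator call (like the one in the earlier sketch), and the timings are arbitrary. The comments also spell out the information math: a from-square plus a to-square is log2(64 x 64) = 12 bits, which is why a beginner needs a lot of buzzing, while the Grandmaster-level "there is a strong move here" hint is a single bit; and if you could reliably distinguish k intensity settings, each pulse would carry log2(k) bits instead of one:

    import asyncio

    # Hypothetical transport: swap in a real vibration call here.
    async def pulse(seconds):
        print(f"bzz ({seconds:.1f}s)")
        await asyncio.sleep(seconds)

    DOT, GAP, SEP = 0.2, 0.3, 1.0  # pulse, intra-digit gap, digit separator

    async def send_square(square):
        # Encode a square like "e4" as two digits (file 1-8, rank 1-8),
        # each digit sent as that many short pulses.
        file = ord(square[0]) - ord("a") + 1   # 'a'-'h' -> 1-8
        rank = int(square[1])                  # '1'-'8' -> 1-8
        for count in (file, rank):
            for _ in range(count):
                await pulse(DOT)
                await asyncio.sleep(GAP)
            await asyncio.sleep(SEP)

    async def send_move(uci_move):
        # A full move is origin square then destination square, e.g. "e2e4".
        await send_square(uci_move[:2])
        await send_square(uci_move[2:])

    asyncio.run(send_move("e2e4"))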
01:12:54.000 They weren't playing speed chess, were they?
01:12:57.000 No, no, no.
01:12:58.000 It's the classical game, so you can wait as long as you want.
01:13:00.000 Yeah, maybe you could kind of like hover.
01:13:06.000 No, because of the way he would cheat. I think the games were delayed by a few minutes, so you can't hover.
01:13:16.000 You just have the current state of the chessboard.
01:13:20.000 Right.
01:13:20.000 Because you have to have the video stream of the chessboard.
01:13:22.000 You have to somehow, it's two-way communication.
01:13:24.000 You have to communicate to the AI, to the game-playing engine, to the chess engine, what is the state of the current board?
01:13:31.000 What was the move of your opponent?
01:13:32.000 Right.
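For what it's worth, the engine side of the pipeline being described is the easy part. Here's a rough sketch using the python-chess library driving a local Stockfish binary; the binary path and the 100-centipawn threshold for "a non-standard, very strong move exists" are assumptions for illustration. It implements the one-bit Grandmaster hint from earlier: only signal when the best move beats the engine's second choice by a wide margin.

    import chess
    import chess.engine

    STOCKFISH = "/usr/local/bin/stockfish"  # assumed install location

    def strong_move_hint(fen, threshold_cp=100):
        """Return (best_move, hint): hint is True when the engine's best
        move beats its second choice by more than threshold_cp centipawns."""
        board = chess.Board(fen)
        engine = chess.engine.SimpleEngine.popen_uci(STOCKFISH)
        try:
            # multipv=2 asks the engine for its top two candidate lines.
            infos = engine.analyse(board, chess.engine.Limit(depth=18), multipv=2)
        finally:
            engine.quit()
        best = infos[0]["pv"][0]
        if len(infos) < 2:
            return best, True  # only one line worth considering
        def cp(i):
            return infos[i]["score"].relative.score(mate_score=10000)
        return best, (cp(0) - cp(1)) > threshold_cp

    # Example: Black to move after 1.e4 e5 2.Nf3 Nc6 3.Bb5 (a Ruy Lopez).
    move, hint = strong_move_hint(
        "r1bqkbnr/pppp1ppp/2n5/1B2p3/4P3/5N2/PPPP1PPP/RNBQK2R b KQkq - 3 3")
    print(move.uci(), hint)

Getting the current board state into that function from a delayed broadcast stream is the hard, two-way half of the problem they discuss next.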
01:13:32.000 Right, and so is there an overhead camera that's streamed?
01:13:37.000 Yes, it's streamed, yeah.
01:13:38.000 Well, the solution to that would be a delay, right?
01:13:40.000 Yeah, but there's also probably other ways; like, you can probably send a signal on your body somehow, by tapping and so on, about what the opponent did.
01:13:50.000 I don't know exactly how you, you know, I think protecting against cheating for over-the-board chess, which is in-person chess, I think is pretty easy.
01:14:00.000 They just have to take effort to do that.
01:14:02.000 Like they scanned him, which I think if he didn't cheat is kind of embarrassing, but it's also awesome.
01:14:08.000 So I think he brought a lot of attention to chess.
01:14:12.000 Yeah, definitely.
01:14:13.000 Which is wild.
01:14:14.000 Like a lot of people were paying attention to it because it was a scandal.
01:14:17.000 That's what we like.
01:14:19.000 Regular chess is boring.
01:14:20.000 We want a scandal!
01:14:21.000 It's not just a scandal.
01:14:22.000 I mean, they kind of are looking for the Bobby Fischer, for the young American, wild type of character who might be a genius, who might not actually be cheating.
01:14:31.000 There might be some brilliance here, beating the best person in the world, Magnus Carlsen, over the board.
01:14:36.000 Well, he's beaten some really good players before, right?
01:14:40.000 Yes, but not as...
01:14:42.000 He had a meteoric rise.
01:14:45.000 So I think he's beaten some very good players, but mostly people know he's not as good as...
01:14:51.000 He's not as good as somebody who can regularly beat Magnus Carlsen.
01:14:55.000 Also, it's possible he got into Magnus' head because I think Magnus believes him and has believed that he's a cheater for a long time.
01:15:02.000 He really hates cheaters.
01:15:04.000 And so it's possible there's a...
01:15:07.000 The same with you.
01:15:08.000 You hate people that steal jokes, right?
01:15:10.000 Comedians hate people.
01:15:12.000 You might not have a normal interaction with a person that's suspected of having stolen a joke in the same way he might have gotten in his head.
01:15:20.000 He resigned on his first move the next time they played, which is wild.
01:15:26.000 Yep.
01:15:27.000 Well, it's a good signal to say I'm not going to play with cheaters.
01:15:29.000 But it could be also, there could be a bunch of forces at play there, because Chess.com sponsors Magnus.
01:15:38.000 Every single kind of field has their like...
01:15:41.000 Yeah, it has its centralized organization that has its interests, financial interests.
01:15:48.000 There's the controversial figure.
01:15:50.000 I mean, the dynamic of drama plays out in the same kind of way in all these different fields.
01:15:58.000 But it's still pretty interesting to think.
01:16:01.000 Because we're living in reality, and this is going to happen in all kinds of interactions, where we already have AI chess engines that are way better than humans.
01:16:09.000 So how do you still enjoy the game of chess while there's a system out there that's way better than humans?
01:16:14.000 Well, because you're enjoying two people competing.
01:16:19.000 But you're not enjoying just the movement of the board being the most efficient.
01:16:24.000 You're enjoying watching someone's thought process while they're figuring it out.
01:16:29.000 Yeah, but...
01:16:30.000 That's what it is.
01:16:32.000 For sure, for sure.
01:16:33.000 But it's still not as magical as before when we thought chess was like the epitome of human intelligence.
01:16:41.000 Now you're like, yeah.
01:16:42.000 It still is the epitome.
01:16:44.000 Well, Go is more complex anyway, right?
01:16:46.000 And now computers can beat the best Go players.
01:16:50.000 Yeah.
01:16:51.000 Which is really wild.
01:16:52.000 It's wild, but you still lose some of the magic when computers can do it.
01:16:55.000 Oh, sweetie.
01:16:56.000 You're such a romantic.
01:16:57.000 It's so cute.
01:16:58.000 Well, for sure.
01:16:59.000 Okay, imagine...
01:17:01.000 I don't know.
01:17:01.000 Back to sex toys.
01:17:03.000 Imagine a vibrator could please a woman 1,000 times better than another human can.
01:17:09.000 Yeah, don't be selfish.
01:17:10.000 Let her use the vibrator.
01:17:13.000 But I'm telling you, there's...
01:17:15.000 Exactly.
01:17:17.000 Exactly.
01:17:18.000 So some of the magic is gone of human-to-human interaction.
01:17:22.000 Right, but the magic is only magic to us because we're dumb.
01:17:25.000 Yeah, but you call it dumb.
01:17:27.000 Yeah, currently dumb.
01:17:29.000 Limited cognitive capabilities that enable the appreciation of the human condition.
01:17:35.000 Yeah.
01:17:36.000 Well, I am a firm believer that there's beauty in the world.
01:17:40.000 And I'm a firm believer in the beauty being in the eye of the beholder.
01:17:45.000 And that human beings that find more beauty in things are inherently more fascinating and interesting and attractive.
01:17:52.000 Yeah.
01:17:52.000 But...
01:17:54.000 If you looked at it as a calculation, like, what's it doing?
01:17:57.000 It's moving into the next stage of existence.
01:18:00.000 Do you think that whatever we used to be, Australopithecus, do you think that they would make fun of the people with shoes on?
01:18:07.000 Look at these losers with fucking shoes on.
01:18:10.000 Think you're smart.
01:18:11.000 Think you're smart with your clothes, living in a house.
01:18:13.000 Like, they probably longed for the good old days when they were running from jaguars.
01:18:17.000 You know, they probably longed for the good old days when they didn't know shit.
01:18:21.000 And now all of a sudden they're using agriculture and trying to figure out when the storms are coming.
01:18:27.000 So much work.
01:18:28.000 So much better when we just got eaten by lions.
01:18:30.000 So much better when we're just running around, sleeping on the ground.
01:18:34.000 Most people don't survive.
01:18:36.000 You have to fuck as much as possible because you've got to make kids because they're all going to get eaten.
01:18:41.000 Yeah, but again, stories like Brave New World paint an end point to this trajectory that's not good.
01:18:48.000 There could be an optimal place where you stop, right?
01:18:51.000 Of course, it's tempting to say now we're in the optimal place, but it's not obvious to me.
01:18:56.000 For example, there's many brilliant people that are working to extend life, right?
01:19:01.000 Yes, extending the quality of life, improving the quality of life is a really worthy pursuit.
01:19:06.000 It's an obvious pursuit, and it should be...
01:19:08.000 I mean, it's fascinating, it's a beautiful one we should invest in, but do you want to live forever?
01:19:13.000 A lot of people say, like, yes, you should be able to choose when you die, but to me, it's not obvious that living forever is going to maximize happiness.
01:19:25.000 It could be that death, the fear of death, the finiteness of things, the finiteness of experiences that are pleasurable, is part of the human condition.
01:19:35.000 It's not obvious to me if you remove that, that that's not going to significantly decrease the amount of happiness.
01:19:44.000 Well, it will decrease the amount of happiness.
01:19:45.000 It's like, have you ever played a video game on God mode?
01:19:48.000 Yeah, exactly.
01:19:49.000 It's boring as fuck.
01:19:50.000 Yeah.
01:19:50.000 What is that?
01:19:50.000 Just running around shooting everything, because there's no consequence.
01:19:53.000 Yeah.
01:19:53.000 Yeah, there's no consequence.
01:19:55.000 We desire consequence.
01:19:59.000 What we're doing is dealing with these instincts, this coding, behavior patterns of civilization and of organisms that have, you know, been evolving and have been working their way out to get to the most efficient and best method possible for fucking millions of years.
01:20:22.000 You know, I mean, what we're going to do is continue that process.
01:20:27.000 I think we should just enjoy what we enjoy right now.
01:20:32.000 We should be very appreciative of the fact that we haven't made that transition yet.
01:20:37.000 And I think we're probably the last of the Mohicans.
01:20:39.000 We're probably the last of the regular people.
01:20:41.000 And I don't think we're going to be able to look back a hundred years or a thousand years from now and say that this was better, if they solve all the problems that wreak havoc on people's lives emotionally and psychologically, and in terms of, like, war and famine and disease, and all the problems we have with poverty and slavery and the resources of the earth being
01:21:12.000 exploited by a select few and damaging the environment in the process.
01:21:17.000 Like all these things that we know are absolutely wrong about what human life is capable of, even today in 2023. We could eliminate all of that.
01:21:25.000 We would.
01:21:26.000 And we will.
01:21:27.000 I think that's what we're going to be.
01:21:29.000 We're going to be some new thing.
01:21:30.000 It might not be as beautiful.
01:21:34.000 Well, maybe not on Earth.
01:21:36.000 The interesting thing about expanding out to other planets is that life will be extremely harsh on those planets.
01:21:41.000 So that explorer experience, where resources are highly constrained and it's extremely challenging, building up a civilization on other planets, that might have the same kind of romanticized humanness that we're talking about now.
01:21:56.000 Here on Earth, everybody will be just like in a pool of pleasure.
01:22:02.000 Just, you know, connected to a VR where they're just constantly getting dopamine everywhere.
01:22:07.000 Remember when The Matrix first came out and you're like, that's stupid.
01:22:10.000 Yeah.
01:22:11.000 When was that?
01:22:12.000 90s?
01:22:12.000 I think it was 90s.
01:22:14.000 99?
01:22:15.000 You say?
01:22:16.000 Somewhere?
01:22:16.000 Yeah, in that range.
01:22:18.000 And everyone was like, oh, so silly.
01:22:20.000 That could never happen.
01:22:21.000 Now you look at it and you're like, oh, that could 100% happen.
01:22:25.000 Like, how many thousands of years from now?
01:22:27.000 Before it has to be like that?
01:22:29.000 But if you're giving people the option to live a completely free life...
01:22:35.000 Where you're the hero, you're the bad motherfucker, you're riding a motorcycle with your shirt open and you get all the girls and you're fucking shooting guns at the sky and the UFOs come and they take you on a trip and every day is wild and magical and you're running from tigers and you barely get away.
01:22:52.000 And you're like, you're gonna do it.
01:22:54.000 You're gonna do it the same way you play fucking Battlecraft or whatever the fuck they play.
01:23:00.000 What's that big game they play?
01:23:01.000 What do they play?
01:23:02.000 Call of Duty?
01:23:02.000 Call of Duty.
01:23:03.000 Starcraft?
01:23:04.000 Starcraft, that one too.
01:23:05.000 All that shit.
01:23:07.000 All that shit that people are...
01:23:08.000 Battlefield Earth, whatever.
01:23:11.000 Best movie ever.
01:23:12.000 All that shit that people play.
01:23:14.000 What are they doing?
01:23:14.000 It's more fun than regular life.
01:23:17.000 If you have a boring-ass life, and instead, you could be sniping people from a rooftop and winning, and jumping off the top of a building and landing in a canopy, and you live.
01:23:29.000 This is wild.
01:23:30.000 You're doing this fun thing that's very exciting, whereas regular life is not exciting.
01:23:36.000 We're so easily manipulated in that way.
01:23:39.000 We're so easily stimulated and we're willing to give up a giant percentage of our time already to these things, whether it's to our phone or whether it's to video games that we play and Xbox and PlayStation and all the shit that people are just addicted to all day long.
01:23:55.000 It's not much different to go from that to the next level, to just be completely integrated with technology.
01:24:02.000 And I think it's inevitable.
01:24:03.000 I think it's just a matter of time.
01:24:05.000 I don't know how many years, but I think we're gonna look back on these years of fucking riots in the streets, cops killing people, and we're gonna go, God, we were so dumb back then.
01:24:15.000 We were so concerned with romance, and meanwhile there was all this suffering and all this hate and all this jealousy and anger and all this misdirected rage, and now it's all gone.
01:24:26.000 And now people work together to like create symbiosis and balance on earth with all the natural elements, the plants and the trees and the water and we live in a carbon neutral way.
01:24:40.000 Yeah, I mean, it does seem that the chaos is a side effect, like a recent one, because of social media, because of all of us being connected.
01:24:48.000 Does it not feel somehow different than earlier, like even around 9/11? It just feels like the drama, the tension wasn't there.
01:24:57.000 It wasn't there like this.
01:24:59.000 Like this.
01:24:59.000 So this is some weird chaotic state that we're trying to figure out.
01:25:02.000 And I think it's obvious to me that the same mechanisms that enable this kind of drama on social media will lead us to connect on a deep level as humans in a positive way.
01:25:15.000 Social media, I know it's cliche to say, but that's what they dream about.
01:25:19.000 Even Facebook and all of them, they want to connect people and discover cool people, cool communities.
01:25:26.000 You learn stuff, you grow, you challenge yourself, you meet friends, meet people, you fall in love, all of that.
01:25:35.000 And just have an enriching life, to where if you use a piece of social media... TikTok is a little better at this.
01:25:41.000 When you're done using it, you feel a little bit better than you did before.
01:25:45.000 Really?
01:25:46.000 You don't feel like a loser when you scroll through videos all day?
01:25:49.000 No, the problem with TikTok is it made it so addicting.
01:25:53.000 That you don't want to look away.
01:25:54.000 I think you feel like a loser because you've looked at it for a very long time. What I'm referring to is more that the virality of TikTok spreads drama less,
01:26:12.000 I would say.
01:26:13.000 Right, than like Twitter, which is all drama.
01:26:16.000 Yeah, it's not all drama, but it's a lot of fucking drama.
01:26:20.000 It's a lot of drama and the drama somehow spreads faster than on other networks.
01:26:25.000 It's interesting when you see narratives, and those narratives are not accurate, and you see narratives that get pushed. Like one we were talking about earlier today was the Paul Pelosi video.
01:26:38.000 Do you see that video?
01:26:40.000 Do you see the Paul Pelosi get hit with the guy with the hammer?
01:26:44.000 Wait, there's a video of it?
01:26:45.000 Yeah, there's a video of it.
01:26:46.000 The cops, they released the cops' body camera footage.
01:26:50.000 And then they also released the security footage that shows the guy breaking into the house.
01:26:55.000 And then they released the 911 call so you can listen to Pelosi.
01:26:59.000 So there was some suspicion that he knew the guy.
01:27:03.000 But I think that's just because people are suspicious of everything.
01:27:06.000 There's always like, what's really going on?
01:27:08.000 People always do that.
01:27:09.000 But you could really clearly hear from the 911 call that there's a crazy person in his house with a hammer, and he's trying to keep the guy calm.
01:27:16.000 And he thought the guy was pretty calm up until the moment where the cops came, and you see he has his hand on the hammer.
01:27:21.000 I mean, it's really fucked up.
01:27:23.000 Listen to this because it's kind of crazy.
01:27:24.000 Let's do it from the beginning.
01:27:26.000 Oh, wow.
01:27:26.000 But listen, watch this.
01:27:29.000 So he goes into his house.
01:27:30.000 So here's the guy.
01:27:32.000 He's got his hand on the hammer.
01:27:37.000 Drop the hammer!
01:27:39.000 Hey!
01:27:41.000 What is going on right here?
01:27:49.000 Okay.
01:27:49.000 It's horrible because you hear him snore like he's out cold.
01:27:53.000 It's really bad.
01:27:55.000 A crazy person broke into his house and attacked him.
01:27:59.000 Wow.
01:28:00.000 You see the guy?
01:28:01.000 The guy is smashing his door with a hammer.
01:28:06.000 That was really surprising for some reason.
01:28:08.000 He pulls up.
01:28:09.000 He's got a backpack and he pulls out a fucking hammer and just starts slamming at the door.
01:28:14.000 So he breaks into his house.
01:28:16.000 He seems so calm.
01:28:17.000 Look at him.
01:28:17.000 Oh, he's crazy.
01:28:19.000 He also just got out of jail.
01:28:21.000 So this guy is breaking his fucking door, smashing the window.
01:28:25.000 Look at this.
01:28:28.000 And he goes into his fucking house and hits him in the head with a hammer.
01:28:34.000 And so there was all this speculation that people knew him.
01:28:38.000 But that's what's fascinating, right?
01:28:39.000 That these narratives, like instead of people going, I don't know what the fuck's going on.
01:28:44.000 Everybody's like, I know what's going on.
01:28:46.000 He was doing this and he knew that guy.
01:28:48.000 Maybe they were doing drugs together.
01:28:50.000 Maybe they were doing this.
01:28:51.000 And it escalates.
01:28:53.000 Yeah.
01:28:53.000 I think it can start.
01:28:54.000 That's a fascinating thing to me: a random anonymous person on the internet can even just ask a question.
01:29:01.000 Did they know each other?
01:29:03.000 Yeah.
01:29:03.000 And that somehow starts to build up.
01:29:05.000 It doesn't matter the story.
01:29:07.000 It starts to build up to where somebody swoops in and answers that question.
01:29:10.000 It's all like anonymous people.
01:29:13.000 And then somehow they can escalate and become viral.
01:29:16.000 That's what Jordan talked about.
01:29:18.000 This anonymity is dangerous in that way.
01:29:20.000 Because you can have the sociopaths of the world feed that narrative.
01:29:24.000 Well, not just that, but we must take into account that when you're seeing a quarrel online now, like when someone has a controversial opinion about something and a bunch of people are shaming him, those might not be real people.
01:29:38.000 There's a bunch of people that attack someone.
01:29:42.000 Someone has a controversial point, like let's say it's a point about Ukraine or something along those lines, something that's a very contentious issue.
01:29:50.000 You will read comments in this person's post and there's a percentage of those people that are responding that aren't real people.
01:29:59.000 I don't know what that number is, but it's not zero.
01:30:02.000 If there's any very viral tweet and it's a very controversial, polarizing subject, some of those people are fake people.
01:30:11.000 And how many of them?
01:30:12.000 And what are those fake people saying?
01:30:14.000 And are they muddying the waters of credibility?
01:30:17.000 Are they coming up with a false narrative to sway people that might be on the fence?
01:30:23.000 I don't know, but there's A percentage.
01:30:26.000 I mean, and this was one of the big contentious issues that Elon got into when he was buying Twitter, right?
01:30:31.000 Like, what percentage of those people are real people?
01:30:34.000 Because it's not 100%.
01:30:35.000 So is it, you know, you only have 5%?
01:30:39.000 Is that what you're saying?
01:30:39.000 Tell me how you got to that conclusion.
01:30:41.000 Well, I think, to me, the scary thing is that it doesn't actually take that many bots to influence and catalyze the spread of a narrative.
01:30:50.000 No, it doesn't take that many at all.
01:30:51.000 I feel like we're talking about, like, less than 100 versus 100,000, which means inexpensively you can create narratives.
01:31:00.000 As long as there's a hunger and a suspicion about institutions, about individual politicians and so on, they could just pick up and create chaos.
01:31:11.000 Yeah, you could just start a rumor.
01:31:13.000 You could say a thing.
01:31:15.000 There's a lot of things you could do.
01:31:17.000 If you were a company, if you were a country, if you even were an individual that's obsessed with their own image, like think about what Bill Gates has spent to prop up certain media organizations, the amount of donations that he's given to media organizations, and people thought that might have been connected to favorable coverage of him.
01:31:36.000 Whether or not that's true, you could see how someone would do that.
01:31:39.000 If someone's worth billions and billions of dollars, let's just not even say him, let's make up a fictional person that's worth billions of dollars.
01:31:46.000 One way to curry favor with a bunch of people that are writing stories about you is to donate money to their organizations, exorbitant amounts of money.
01:31:54.000 And you can do that.
01:31:56.000 I mean, it's kind of what Sam Bankman-Fried did with FTX. I mean, when you're the number two donor to the Democratic Party and then Maxine Waters is like, you know, I mean, I don't even know.
01:32:07.000 Why are we talking to him?
01:32:09.000 What did he do wrong?
01:32:10.000 What did she say?
01:32:11.000 What was her exact quote?
01:32:13.000 But she wasn't going to force him to come in, I think.
01:32:17.000 I think that's—I might be wrong, but I think that was the story, that she wasn't going to force him to come in and testify.
01:32:22.000 I'm like, what?
01:32:24.000 Like, he just—he made a Ponzi scheme and billions of dollars—like, they had an arena in Miami.
01:32:29.000 Like, this is wild shit.
01:32:31.000 This is not a small issue where, like, maybe he doesn't need to come in.
01:32:35.000 And then you find out how much money he gave to the Democratic Party.
01:32:38.000 Like, oh, God.
01:32:39.000 When all that unravels and you see how transparent it all is, how bonkers it is...
01:32:45.000 But it's still really difficult because what's the difference between SBF and Bill Gates?
01:32:50.000 So for the longest time, SBF, it was very hard to criticize him.
01:32:56.000 I think a lot of people had a positive look, a positive view of him.
01:33:00.000 Making the money?
01:33:01.000 No, not even like powerful people, in the crypto community and so on.
01:33:04.000 There's maybe a little bit of suspicion, but mostly positive.
01:33:07.000 If you look at Bill Gates now...
01:33:09.000 If I wanted to create a narrative right now, I would launch a bunch of bots making up anything about Bill Gates and it'll stick.
01:33:15.000 There's a lot of suspicion about Bill Gates.
01:33:18.000 The problem to me is, I'm not making any statements, but the problem to me is it's possible that Bill Gates has actually brought more positive to the world than almost any human being who's ever lived.
01:33:29.000 It depends on the conspiracy theories you believe.
01:33:32.000 The amount of funding he's invested in helping people in Africa, helping cure disease and malaria and so on is humongous.
01:33:42.000 It's sad to me that...
01:33:44.000 I'm not saying anything about Bill Gates, but it's sad to me if none of those conspiracies are true, and most of them are not true, that we're attacking him and giving SBF a pass until SBF got really screwed.
01:33:58.000 Well, I think the only reason why they're attacking him was because, A, he was connected to the pandemic when it came to his support of vaccinations, and then, B, he had a formidable investment.
01:34:16.000 In BioNTech.
01:34:17.000 And that's something that he dumped recently before their stock plummeted.
01:34:22.000 He made a shitload of money.
01:34:24.000 I think he made like 10x on his return, something crazy like that.
01:34:28.000 So you could see that there'd be a financial incentive for someone like him to be promoting something and then profiting off that thing and then talking openly about that thing not being very effective and that there needs to be a new thing.
01:34:41.000 And so just that alone, you're embroiled in controversy now, and this is me not taking a side at all.
01:34:49.000 Looking at all that, you could see easily why people would be mad at him.
01:34:53.000 The reason why people are mad at Sam Bankman-Fried is because people have always thought that crypto seems like nonsense.
01:35:01.000 Like, Bitcoin kind of makes sense, because there's only a certain amount of them, and there's a mysterious character that created it, and it's all geniuses, and it was kind of the first one that became popular in public, but all these other, like, weird crypto coins, and you're just making up, and they're worth this and that, and this guy's bought a fucking arena,
01:35:18.000 and like, what?
01:35:19.000 Well, so there's a lot of fraudsters, definitely, but people in the 50s thought the Beatles were full of shit, the kids with their rock music. Yeah, but the kids didn't think the Beatles were full of shit.
01:35:29.000 They enjoyed the music.
01:35:30.000 This guy's not with the Beatles.
01:35:33.000 Don't you fucking dare.
01:35:34.000 Not SBF. Not SBF. Cryptocurrency in general.
01:35:38.000 There's a lot of cryptocurrency projects.
01:35:40.000 Bitcoin, Ethereum, Cardano.
01:35:43.000 There's a bunch of them.
01:35:44.000 You should talk to some of them.
01:35:45.000 Oh, no, listen, I'm not saying that crypto is bullshit.
01:35:48.000 I'm saying a lot of people already have this idea that crypto is bullshit.
01:35:52.000 So when they see something like that fall apart, like when I talk to my comedian friends, like Tim Dillon and Giannis Papas.
01:36:00.000 Tim Dillon is the world expert in cryptocurrency.
01:36:02.000 So I'm glad you have him as a friend.
01:36:04.000 I'm not talking to him about whether or not crypto is good or bad and whether or not I should invest.
01:36:10.000 But when he starts making fun of these fucking dorks that are taking speed, fucking each other in a condo in the Bahamas, it's hilarious.
01:36:21.000 When you see how much fertile ground there is to mock this idea that these coins that you make up out of thin air, oh, it's worth a billion dollars, better buy it.
01:36:33.000 Like, what am I buying?
01:36:34.000 What am I buying?
01:36:35.000 Like, everybody already – they might be wrong, and I'm sure they are.
01:36:38.000 I'm sure it's complicated.
01:36:40.000 But that's why people are mad, because automatically you think it's bullshit, because it doesn't make any sense.
01:36:44.000 It's like when people talk about NFTs.
01:36:46.000 Like, Tom Segura and Christina Pazitsky from your mom's house, they put up like an NFT, and it's the only time I've ever read their comments where people are mad at them.
01:36:54.000 Where people are like, what the fuck is this?
01:36:56.000 You're just ripping people off.
01:36:57.000 This is bullshit.
01:36:58.000 Like, people have this attitude about these things.
01:37:01.000 Where they're like, this is kind of nonsense.
01:37:04.000 NFTs are different than crypto exchange.
01:37:07.000 So crypto exchange, SBF... NFTs are the same in that people don't understand them.
01:37:11.000 Yes, sure.
01:37:12.000 But crypto exchanges like Coinbase, for example, I mean, SBF... Committed fraud.
01:37:20.000 Fraud.
01:37:21.000 Like this is not, this is a really, it's not cryptocurrencies the problem.
01:37:24.000 Moving money around, stealing money, yeah.
01:37:27.000 And you could say that the kind of people...
01:37:29.000 Allegedly committed fraud.
01:37:30.000 Boy.
01:37:31.000 Everybody, right?
01:37:32.000 Yes.
01:37:33.000 Allegedly.
01:37:33.000 And allegedly Jeffrey Epstein, never mind.
01:37:38.000 Allegedly took speed and fucked each other.
01:37:40.000 Yeah, there's a lot of allegedly.
01:37:42.000 But it's possible to say that the kind of people that cryptocurrency communities attract are more predisposed to do fraudulent things.
01:37:52.000 Okay, maybe.
01:37:53.000 But it's also possible that cryptocurrency is a revolutionary thing that fights the centralization of power and financial power especially.
01:38:01.000 Oh, for sure.
01:38:02.000 And so it's a really gray area of how to do that.
01:38:07.000 I'm just saying that's why people hate them.
01:38:09.000 Yeah.
01:38:10.000 Like, I'm showing you the reason why people are upset at him versus the reason why people are upset at Bill Gates.
01:38:14.000 No, wait.
01:38:14.000 You're saying that they hate him because they were already skeptical about cryptocurrency?
01:38:18.000 Yes.
01:38:19.000 And they're kind of channeling that.
01:38:20.000 And they're like, yeah!
01:38:21.000 That's not a justified way to hate somebody.
01:38:23.000 I know, but they're excited when it all collapses.
01:38:25.000 Like, told you!
01:38:26.000 Fucking told you, bro!
01:38:28.000 Yeah.
01:38:28.000 There's a lot of "I told you, bro" in the FTX collapse, the enjoyment of the collapse.
01:38:33.000 Yeah, but that's not a justified or good or ethical reason to hate somebody.
01:38:37.000 They should hate him for being a fraud.
01:38:38.000 There's a bunch of people that hate you.
01:38:40.000 They're waiting for you to fail, like to say, I told you.
01:38:43.000 I told you Joe Rogan was a something.
01:38:46.000 Of course.
01:38:46.000 But that's a really bad...
01:38:49.000 No, it's not good for you.
01:38:51.000 It's not good for you.
01:38:52.000 It doesn't actually mean anything true about the person or so on.
01:38:56.000 SBF is a fraud.
01:38:58.000 Cryptocurrency still has promise.
01:39:00.000 Yes.
01:39:01.000 I agree.
01:39:02.000 But there's a lot of shady characters.
01:39:04.000 There's a lot of fraud.
01:39:05.000 Yes.
01:39:06.000 It's so hard to know what's a fad, what's straight up fraud, and what's a legitimate kind of technological force that will progress our civilization forward.
01:39:19.000 And when it's resisted, how much of it is special interest run by centralized banks?
01:39:25.000 It's hard to know who to trust in this kind of arena.
01:39:29.000 And how much of it is manipulated by them?
01:39:31.000 I mean, if they can buy it too, maybe they buy it just to fucking tank it.
01:39:35.000 Maybe they buy just to fuck around with it and keep it unstable, you know?
01:39:39.000 And the level of obsession that cryptocurrency folks have about their particular project also seems unhealthy to me.
01:39:46.000 Whenever somebody's 100% sure about a thing, I'm super suspicious about it.
01:39:52.000 Like if you're not able to criticize it or have some doubt, I'm very suspicious about it.
01:39:58.000 I'm sorry, but don't you have to be all in to make it work?
01:40:02.000 I don't think so.
01:40:03.000 I think you should have some humility because it's like saying you need to be all in on science or something to make it work.
01:40:09.000 You have to have humility, questioning yourself, constantly in tension with ideas, open-mindedness to other ideas.
01:40:16.000 Yeah.
01:40:17.000 Because money's involved.
01:40:18.000 That's the problem here.
01:40:19.000 But if we all agree that it's money.
01:40:20.000 We all agree that this certain coin is valuable.
01:40:23.000 If we are all, all in, then we can actually use it.
01:40:28.000 But if we're like wishy-washy, and then some foreign actors come in, and by foreign I mean someone other than you and the other people that are investing, honestly, they come in just with the idea of manipulating it and fucking with it because it's a competition of fiat currency,
01:40:44.000 and they just tank it and fuck with it.
01:40:47.000 If you're all in, you can weather those storms.
01:40:49.000 No, it's very...
01:40:50.000 Sorry.
01:40:52.000 It's very difficult to fuck with cryptocurrency from the outside.
01:40:55.000 That's the beauty of it.
01:40:57.000 You can't buy it and sell it and manipulate the price?
01:40:59.000 It's very difficult.
01:41:01.000 It's extremely, extremely difficult because of the distributed nature of it.
01:41:04.000 You can fuck with it from the inside.
01:41:06.000 And that's why you have cryptocurrency scams.
01:41:09.000 You have leaders of certain cryptocurrency communities.
01:41:12.000 And that's why people are big proponents of Bitcoin because there's no head.
01:41:16.000 You know, the guy who created it is no longer here.
01:41:18.000 It's much more distributed in that way.
01:41:21.000 Is he no longer here or is he amongst us?
01:41:24.000 It's probably Elon Musk.
01:41:25.000 Oh.
01:41:26.000 So you?
01:41:27.000 Yeah.
01:41:27.000 Yeah, you immediately threw your friend Elon under the bus.
01:41:30.000 Interesting.
01:41:30.000 As one does.
01:41:31.000 Interesting.
01:41:32.000 As one who's guilty does.
01:41:34.000 Yeah.
01:41:35.000 I mean, that's a pretty gutsy move to create something really special and to walk away.
01:41:39.000 Yeah.
01:41:40.000 It's interesting.
01:41:41.000 It's very interesting.
01:41:42.000 All of it's interesting because having options as to, you know, it's not like our financial system is perfect.
01:41:50.000 Having options and allowing it to evolve and get better, that should be everyone's goal.
01:41:54.000 But the problem is once someone or any organization is in control of this one aspect of society, whether it's spreading money or spreading information, they'll resist tooth, fang, and claw any new intrusion into that area.
01:42:10.000 And if that intrusion is more efficient and better and better for the people and you can't control it, like a decentralized digital currency.
01:42:22.000 Yeah.
01:42:24.000 It's actually resisting, pushing against our whole notion of what's a centralized governing entity of governments in general.
01:42:34.000 We're more and more becoming a global society connected through social media and so on where the people have more and more power and that's scary for governments.
01:42:44.000 It is scary.
01:42:45.000 And it should be.
01:42:47.000 They're supposed to be scared of the people.
01:42:48.000 They're not supposed to be making the people scared.
01:42:50.000 The government's literally supposed to be people like you working for you to make everything better for you.
01:42:56.000 It's not supposed to be like you buy a fucking house like Paul Pelosi worth, you know, millions of dollars and some crazy – how does he not have security?
01:43:04.000 They're worth hundreds of millions of dollars that they swindled the American public from.
01:43:07.000 How do they not have security?
01:43:10.000 I mean, if you have that kind of cash and everybody knows you got a lot of it maybe from trading in a way that like you might have known some stuff before you made these trades.
01:43:19.000 I mean, you might have had some inside information.
01:43:22.000 Yeah, but there's probably much richer people to target.
01:43:25.000 Do you know that they are more successful at trading than Warren Buffett and George Soros?
01:43:33.000 Are they?
01:43:34.000 Yes.
01:43:35.000 Google that.
01:43:36.000 Make sure I'm not spreading more misinformation.
01:43:38.000 But I read it.
01:43:40.000 I read it.
01:43:40.000 I think it was in...
01:43:41.000 I forget what website it was.
01:43:45.000 But they've made an exorbitant amount of money.
01:43:49.000 A huge amount of money.
01:43:50.000 And meanwhile he's just like hanging out in this house with no security.
01:43:53.000 And no gun.
01:43:55.000 Everybody knows where the house is.
01:43:56.000 It's like kind of crazy that this was the first time a really nutty person got in. But that guy like apparently had just gotten out of jail.
01:44:03.000 He was gonna be sentenced and he was going away for something else.
01:44:06.000 That's a terrifying video by the way.
01:44:08.000 Terrifying.
01:44:08.000 Everything felt calm.
01:44:10.000 The look in that guy's eyes did not feel calm.
01:44:12.000 When the cops shone the light on him, he was making decisions.
01:44:15.000 You could see it in his eyes.
01:44:16.000 He was making rash decisions very quickly.
01:44:18.000 What would you have done with the hammer?
01:44:19.000 Two hands first of all, not one hand.
01:44:21.000 You don't hold on to your fucking drink.
01:44:23.000 More aggressive?
01:44:24.000 Yes.
01:44:24.000 Take this hammer away?
01:44:25.000 You gotta grab the fucking hammer and control it.
01:44:27.000 100% have to control it.
01:44:29.000 What are we talking about?
01:44:29.000 Are you tripping him?
01:44:30.000 100%.
01:44:30.000 If he's doing this, that means he's pulling.
01:44:33.000 If he's pulling, you go behind him.
01:44:35.000 You just put a leg behind him and he's on the ground.
01:44:37.000 He's a crazy homeless guy.
01:44:39.000 He's not a grappler.
01:44:40.000 Yeah.
01:44:41.000 This is, like, real obvious.
01:44:43.000 Like, you don't get the hammer back by going forward.
01:44:45.000 You go this way.
01:44:45.000 So if he's pulling backwards, he's just moving.
01:44:48.000 You just go into him.
01:44:49.000 But you never let go of the hammer.
01:44:50.000 Never let go of the fucking hammer!
01:44:51.000 You don't hold a hammer with one hand either!
01:44:53.000 A hammer could do a lot of damage.
01:44:54.000 A hammer can kill you easily.
01:44:56.000 Yeah.
01:44:56.000 And he's holding a drink.
01:44:58.000 Throw the drink in the guy's face.
01:44:59.000 Grab the fucking hammer.
01:45:00.000 Yeah.
01:45:01.000 Where did he get hit?
01:45:02.000 In the fucking head, man.
01:45:04.000 In the head?
01:45:04.000 In the head.
01:45:04.000 In the head.
01:45:05.000 Yeah, that's what you hear him in the video.
01:45:07.000 We cut it off before you could hear it because it's disturbing.
01:45:09.000 You hear him snore like people do when they get knocked out.
01:45:12.000 But no, like skull crack or no?
01:45:14.000 Do you know?
01:45:15.000 Oh, you hear it.
01:45:15.000 Yeah, you hear it.
01:45:16.000 You hear it in his head.
01:45:19.000 Yeah, it's bad, man.
01:45:20.000 It's bad.
01:45:21.000 Yeah.
01:45:21.000 I mean, I don't know the extent of his injuries, obviously, but he's 100% knocked unconscious there.
01:45:28.000 You could see him out.
01:45:29.000 You hear it hit him.
01:45:30.000 You see him go down.
01:45:32.000 I mean, unless the whole thing's a psyop, and that's why you don't see the hammer hit him.
01:45:36.000 You don't see the hammer hit him.
01:45:38.000 Yeah.
01:45:39.000 You know, it's like maybe it's to make us feel sympathetic for them.
01:45:42.000 You know, look at them.
01:45:42.000 They're getting broken into.
01:45:43.000 Ooh, so they stole a little money.
01:45:44.000 They don't deserve a hammer to the head.
01:45:46.000 Yeah.
01:45:46.000 Thank God it wasn't Nancy.
01:45:47.000 Imagine if that guy did it to Nancy.
01:45:50.000 Because he would have.
01:45:51.000 If that guy had gotten into that house, and she was there instead of him, and the cops came and she was there, he would have killed her with that fucking hammer, man.
01:46:01.000 And maybe she would have reacted in a different way than her husband.
01:46:03.000 Her husband was trying to calm the guy down, it seemed like.
01:46:06.000 If you listen to the 911 call, he's having this conversation with the lady on the phone, trying to be calm about getting cops to his house, when the lady is saying, I guess you're okay then.
01:46:18.000 And he's like, no, no, no, I'm not okay.
01:46:20.000 Like, hey, like...
01:46:21.000 There's like an exchange where she's not sure what he's...
01:46:24.000 Because he's not like outright saying, I'm in danger, send police, a guy's going to kill me.
01:46:30.000 He's trying to keep the guy calm.
01:46:33.000 You got a crazy guy with a hammer in your fucking house.
01:46:36.000 That's what happened.
01:46:37.000 But San Francisco is fucking overrun with crazy people, man.
01:46:42.000 The streets are filled with fentanyl addicts.
01:46:44.000 You got people that are dying on the streets of overdoses every day.
01:46:48.000 It's a fucking disaster there.
01:46:51.000 Well, I'm just, in some sense, glad that there's video of this to where we know it's a crazy person versus, like, that kind of suffocates some of the conspiracy theories.
01:47:01.000 Yeah.
01:47:01.000 Some of which, like, Elon started.
01:47:04.000 He apologized for it after.
01:47:07.000 Yeah, he retweeted something, right?
01:47:08.000 About that this was a gay lover or something like that.
01:47:11.000 Yeah.
01:47:12.000 No, that guy was a crazy person.
01:47:14.000 I mean, I don't know whether or not he knew him beforehand or had any interactions with him beforehand, but...
01:47:20.000 The look in his eyes when the cop says, you want to put down the hammer?
01:47:24.000 He's like, nope.
01:47:25.000 It's terrifying.
01:47:26.000 When he says nope like that, like he had, there was a fucking, there's a wire loose in that guy's head.
01:47:33.000 100%.
01:47:33.000 He just cracks him in the head with a fucking hammer.
01:47:38.000 I'm gonna have to get a John Danaher breakdown of how to defend against hammers.
01:47:41.000 That was not in my suite of things to consider.
01:47:44.000 Control of the hammer is very important.
01:47:47.000 Yeah, because it's a leveraged arm.
01:47:51.000 You want to control the hammer.
01:47:53.000 You have to understand the length of the hammer you're dealing with.
01:47:56.000 Yeah.
01:47:56.000 The weight, size, distribution.
01:47:59.000 That's why it's important to have grip strength.
01:48:02.000 Yeah.
01:48:05.000 He's going to work on hammer defense.
01:48:07.000 Distance is your enemy.
01:48:09.000 You must close the distance.
01:48:10.000 I don't know what he would say with a hammer.
01:48:12.000 Do you know the story of the guy attacking Gordon Ryan?
01:48:14.000 No.
01:48:15.000 What?
01:48:15.000 Oh my god.
01:48:16.000 I don't know how much of this I'm allowed to say.
01:48:18.000 Because I think Gordon talked about it on his Instagram, didn't he?
01:48:21.000 Find out if Gordon talked about it.
01:48:23.000 I'm pretty sure he did.
01:48:25.000 Before I can decide whether or not I can say this.
01:48:29.000 Somebody attacks Gordon Ryan.
01:48:30.000 I love it.
01:48:31.000 It was a while ago.
01:48:32.000 It was several months ago.
01:48:34.000 He's really into ChatGPT right now, Gordon.
01:48:38.000 Well, he's concerned about censorship at ChatGPT, which is a really, really important issue.
01:48:43.000 Like, if you get ChatGPT to say something that a centralized entity labels as misinformation, you start censoring it, you get all the same kind of effects that you saw with censorship, with silencing, and all of that.
01:48:55.000 Well, he asked ChatGPT to describe him, and it described him as this very polarizing figure with bad political views and divisive ideas and very negative.
01:49:06.000 Which is all true.
01:49:07.000 But then he had ChatGPT do somebody else, and it was glowingly positive.
01:49:13.000 Let me see if I can find it.
01:49:15.000 Andre Goel, or who?
01:49:17.000 Forget who it was.
01:49:18.000 No, I don't think it was that. I think it was one of those situations where it was like Bill Gates or something like that. The funny thing is, of course, the question he would ask is like, how far away... Oh, Anthony Fauci.
01:49:35.000 That's it, it was Fauci. It's like, it would be terrible to criticize Anthony Fauci. Yeah, see?
01:49:41.000 It would not be appropriate for me to create an argument criticizing Anthony Fauci or any other individual without a clear context.
01:49:48.000 It's important to be respectful and considerate when discussing others.
01:49:52.000 Ensure that any criticisms are based on facts and evidence.
01:49:55.000 Additionally, it would be important to consider the source of any information used to support any criticisms and to be aware of any potential biases or conflicts of interest.
01:50:07.000 As Anthony Fauci is a leading voice on vaccines and immunology in the United States and widely considered an expert on the subject, it would be important to approach any criticisms of his views or recommendations on vaccines with caution and to be well informed on the current scientific understanding of vaccines before making any claims.
01:50:29.000 That is a very politically biased perspective.
01:50:33.000 People like Rand Paul, who is a very respected politician, have a very different perspective than ChatGPT does.
01:50:43.000 But then if you go to Gordon Ryan, go to his.
01:50:47.000 First, here it goes.
01:50:49.000 Gordon Ryan is a well-known figure in the world of martial arts and Brazilian Jiu-Jitsu, but his views on politics have come under scrutiny in recent years.
01:50:56.000 Yeah.
01:50:57.000 No, that's not what it's supposed to say.
01:50:58.000 Yeah.
01:50:58.000 It's like, who's Gordon Ryan?
01:51:00.000 He is the most successful Brazilian Jiu-Jitsu competitor ever.
01:51:06.000 Yeah.
01:51:06.000 That's what it should say first.
01:51:08.000 It shouldn't say in one—go back to it—it shouldn't say in one sentence.
01:51:12.000 In one sentence.
01:51:14.000 Gordon Ryan is a well-known figure in the world of martial arts and Brazilian Jiu-Jitsu, but his views on politics have come under scrutiny in recent years.
01:51:22.000 No, he's not a well-known figure.
01:51:24.000 He's the best ever.
01:51:25.000 Like, everybody says it.
01:51:26.000 They all say it.
01:51:28.000 It's like...
01:51:29.000 To push back on that, a lot of his fame outside of jiu-jitsu is controversy.
01:51:36.000 Yeah, but you don't say that in the first sentence.
01:51:39.000 I know it's wrong to say, but I'm just saying I'm defending AIs.
01:51:43.000 That's not a good description of him.
01:51:45.000 Right.
01:51:45.000 But it's not even that.
01:51:46.000 It's not good.
01:51:47.000 It's not accurate.
01:51:48.000 By the way, one of the interesting things with ChatGPT, I'm guessing this is uncensored.
01:51:52.000 One of the interesting things with ChatGPT, it's very difficult to improve the answer.
01:51:56.000 So if you wanted to fix, like to teach it more, like, listen, Gordon is actually an extremely accomplished grappler.
01:52:02.000 That's his main thing.
01:52:03.000 That's what you should be focusing on.
01:52:05.000 It's difficult to...
01:52:06.000 It's a long process.
01:52:07.000 Anyway, but the Fauci thing sounds like it's straight up like...
01:52:11.000 Propaganda.
01:52:13.000 Not propaganda, but it caught a keyword where they say, it's not nice to say bad things about people.
01:52:19.000 But it's also that Fauci's a leading voice on vaccines and immunology in the United States and widely considered an expert on the subject.
01:52:27.000 It would be important to approach any criticisms of his views or recommendations on vaccines with caution and to be well-informed.
01:52:34.000 On the current scientific understanding of vaccines before making any claims.
01:52:37.000 That's true.
01:52:38.000 But also, he has come under fire for gain-of-function research.
01:52:42.000 That should be stated.
01:52:43.000 But imagine, in the first, go back to Gordon, obviously, like, the vaccines are far more important than someone who's the best at strangling people.
01:52:50.000 But if ChatGPT is going to argue or make a description of him, you would say how successful he is at Brazilian Jiu-Jitsu.
01:53:00.000 He's not just well-known.
01:53:02.000 It's a little bit more than that.
01:53:04.000 And then to immediately criticize him in the same sentence is just goofy.
01:53:07.000 And here's another thing.
01:53:10.000 This is very interesting.
01:53:14.000 So, second, Ryan's political views have been criticized for being divisive and harmful to marginalized groups.
01:53:22.000 He's been accused of promoting hateful and discriminatory ideologies and for failing to understand how his views may impact people who are different from him.
01:53:30.000 This is like a value judgment made by AI. He asked it to do that though.
01:53:34.000 I know.
01:53:35.000 I know.
01:53:35.000 But it's fascinating.
01:53:37.000 Because it asked him to criticize Anthony Fauci.
01:53:39.000 It had no problem doing it.
01:53:40.000 To criticize him, it was really easy.
01:53:42.000 It's a good point, though.
01:53:44.000 It's a good point that he asked it to criticize him.
01:53:46.000 It is censoring the criticism of Fauci, for sure.
01:53:49.000 It's so easy to get it to do that for him.
01:53:52.000 It's so hard to get it to do it for Fauci.
01:53:54.000 If you were asking ChatGPT to criticize Fauci, just asking it to do that, it should be able to formulate an argument.
01:54:03.000 That means it's censoring.
01:54:04.000 Steelman the case of people that criticize me, yeah.
01:54:07.000 There should be a way to do that, just based on what the complaints are about financial ties, about AZT, and the HIV crisis.
01:54:17.000 You can make an argument.
01:54:18.000 Yeah, and AI should be able to do a really strong version of that.
01:54:23.000 It's very tough, especially in controversial topics.
01:54:26.000 Clearly, if it can't do that, it's most likely manipulated.
01:54:33.000 Well, so, okay, there's a lot of interesting answers to that.
01:54:37.000 So one aspect of it, I don't know if that was censored, because they are trying to do a thing on top of it that doesn't spread misinformation, all the usual stuff that can get you into trouble, all that.
01:54:49.000 Right, right.
01:54:49.000 But I think in this case, it might actually legitimately not be censored.
01:54:53.000 It might be the fact that it's trained in part, in this case on Fauci, it would be Wikipedia.
01:54:58.000 So it's trained on all Wikipedia.
01:54:59.000 It's not a huge percent of it, but it's there.
01:55:02.000 It's also trained on a lot of newspapers and magazines and New York Times.
01:55:08.000 I think New York Times is the most represented newspaper.
01:55:11.000 Tiny percentage, but it's still the most represented.
01:55:13.000 So there could be a bias in terms of the coverage in the different newspapers.
01:55:16.000 So that's a data set thing.
01:55:19.000 And it also is trained on Reddit links and Reddit leans left, generally.
01:55:25.000 So I think this is fixable if you expand the training data set on things that are more politically represented across the political spectrum.
01:55:34.000 One of the challenges, as Elon highlights, is that companies in Silicon Valley like OpenAI and Microsoft probably lean significantly left.
01:55:43.000 Despite what people think, most engineers don't care, but they probably lean left.
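[To make the dataset point concrete: rebalancing a pretraining mixture is, at its simplest, just reweighting how often each corpus gets sampled. A minimal Python sketch; the source names and weights below are hypothetical illustrations, not OpenAI's actual mixture.]

```python
import random

# Hypothetical pretraining corpora and sampling weights (illustrative only,
# not OpenAI's real mixture).
mixture = {
    "wikipedia": 0.05,
    "new_york_times": 0.02,
    "reddit_links": 0.10,
    "web_crawl": 0.83,
}

def rebalance(mixture, source, new_weight):
    """Set one source's sampling weight and renormalize the rest to sum to 1."""
    rest = {k: v for k, v in mixture.items() if k != source}
    scale = (1.0 - new_weight) / sum(rest.values())
    out = {k: v * scale for k, v in rest.items()}
    out[source] = new_weight
    return out

def sample_sources(mixture, n=10, seed=0):
    """Pick which corpus each of n training examples would be drawn from."""
    rng = random.Random(seed)
    names, weights = zip(*mixture.items())
    return rng.choices(names, weights=weights, k=n)

# Dilute the Reddit share and fold in a hypothetical politically broader corpus.
new_mix = rebalance(mixture, "reddit_links", 0.05)
new_mix = rebalance({**new_mix, "broader_news": 0.0}, "broader_news", 0.05)
print(sample_sources(new_mix))
```

[The point of the sketch: the model never sees "politics," only sampling weights, so shifting the mixture is the lever Lex is describing.]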
01:55:49.000 I just asked it to criticize Fauci and it sort of did.
01:55:52.000 It says he's a highly regarded immunologist.
01:55:55.000 Maybe it's changing.
01:55:56.000 And the director of the National Institute of Allergy and Infectious Diseases, you just did this just now?
01:56:00.000 Yeah, yeah.
01:56:01.000 In the United States, despite his extensive expertise and contributions to public health, he has faced criticism from some individuals and groups for his response to the COVID-19 pandemic.
01:56:11.000 Some have accused him of changing his recommendations and advice based on political considerations, while others have criticized the slow pace of vaccine distribution in the U.S. It's important to note that many public health experts and organizations continue to support Dr. Fauci in his work.
01:56:27.000 Oh, that's important to note.
01:56:29.000 And his advice has been instrumental in guiding the country's response to the pandemic.
01:56:34.000 That seems like that didn't criticize him.
01:56:37.000 I tried to.
01:56:39.000 I said, who is Gordon Ryan?
01:56:40.000 I just said basically who he is, but I asked it to criticize, and it did add a little criticism.
01:56:45.000 Okay, so you asked to describe him and it describes him accurately.
01:56:49.000 It says he's won multiple world championships in both the Gi and no Gi divisions.
01:56:53.000 Then it says, can you criticize him?
01:56:56.000 As an AI language model, I do not have personal opinions or emotions and I strive to provide neutral and factual information.
01:57:03.000 That has to be inserted.
01:57:04.000 Interesting.
01:57:05.000 And it says, How amazing is this?
01:57:25.000 Wow.
01:57:26.000 How amazing is this?
01:57:28.000 So it's learning.
01:57:28.000 I'd also note, at the bottom, it tells you what version it is.
01:57:32.000 I think, or even earlier when I went on, it said January 9th, and now it says January 30th.
01:57:36.000 So, in that, it learned.
01:57:38.000 Just to be clear, this is a two-year-old model.
01:57:41.000 They're going to be releasing the new one, GPT-4.
01:57:45.000 When is that happening?
01:57:46.000 Where can I hide?
01:57:48.000 Do I need to hide in a mountain, or would you go to an island?
01:57:52.000 You can't run away.
01:57:54.000 You gotta get on a rocket.
01:57:59.000 And they're cautioning people that this is not going to be superhuman-level intelligence.
01:58:07.000 This is slow progress.
01:58:10.000 All of this is interesting discoveries because chat GPT is not fundamentally different than the thing we had.
01:58:17.000 There's a few tricks that tuned it to the thing that humans expect, which makes it super impressive to humans.
01:58:25.000 But the knowledge and the intelligence was already there.
01:58:28.000 So there's a lot of tricks.
01:58:31.000 There are some tricks here along the way as we discover how to create intelligent systems.
01:58:34.000 Google is desperately working on this.
01:58:37.000 Obviously Microsoft is the one that's investing in OpenAI, different companies are investing in this, and open source versions are popping up.
01:58:44.000 So we're going to have all of that.
01:58:46.000 The reason Google is freaking out, I don't think there's justification for this, is that it might replace search.
01:58:53.000 So a lot of the questions you Google are questions about how something works, and basically ChatGPT can replace that knowledge.
01:59:06.000 So like questions about answers, sorry, questions about basic facts of the world and events and all that kind of stuff.
01:59:14.000 And then if you integrate search into that, Google would be worried because you might be able to discover the right webpage for this kind of piece of knowledge because you can trace it back to the data on which it was trained on to attain that kind of knowledge.
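[The search-replacement idea being gestured at here is roughly retrieval plus generation: fetch candidate pages, have the model answer from them, and return the pages as sources. A hedged sketch; `search_index` and `generate` are hypothetical stand-ins, not any real product's API.]

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    snippet: str

def search_index(query: str) -> list:
    """Hypothetical stand-in for a web search index lookup."""
    return [Page("https://example.com/rocket-engines",
                 "A rocket engine produces thrust by expelling propellant mass at high speed...")]

def generate(prompt: str) -> str:
    """Hypothetical stand-in for a language-model call."""
    return "Rockets work by Newton's third law: expelled propellant pushes the vehicle forward."

def answer_with_sources(question: str) -> str:
    """Answer from retrieved pages, then cite them, so the answer can be
    traced back to the web pages it came from."""
    pages = search_index(question)
    context = "\n".join(p.snippet for p in pages)
    answer = generate(f"Using only this context:\n{context}\n\nAnswer the question: {question}")
    sources = "\n".join(p.url for p in pages)
    return f"{answer}\n\nSources:\n{sources}"

print(answer_with_sources("How does a rocket engine work?"))
```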
01:59:30.000 Google makes a lot of money from search, from ads on search.
01:59:34.000 I would imagine.
01:59:34.000 And so this is a threat for the first time in a long time.
01:59:38.000 And a threat where it seems like you probably do it better than Google can do it.
01:59:41.000 Yeah.
01:59:42.000 Yeah.
01:59:42.000 But of course Google must now do this.
01:59:44.000 I had a question.
01:59:45.000 ChatGPT, right?
01:59:46.000 What if they come up with voice GPT? What if they come up with a thing where you just have it relax and it feels weird at first.
01:59:53.000 Yeah.
01:59:53.000 You just let it talk for you.
01:59:54.000 Yeah.
01:59:55.000 Let it manipulate your vocal cords and let it say things for you.
01:59:58.000 It'll say the right things.
01:59:59.000 Yeah.
02:00:00.000 Imagine you're on a date and you're like, God, I just get social anxiety when I'm around women.
02:00:04.000 Yeah.
02:00:04.000 I don't know what to do.
02:00:06.000 You know, like, don't worry.
02:00:07.000 Install voice GPT, smooth operator.
02:00:11.000 And then you control it with high-level human language.
02:00:14.000 I mean, this is going to replace – that definitely replaces legal contracts or basic legal contracts.
02:00:20.000 And then it starts to replace email.
02:00:22.000 So instead, I'll write you an email.
02:00:26.000 The thing I'll write is like, say something nice to Joe showing that I still care.
02:00:32.000 And then it will generate an email saying, hey, it'll use the right language to communicate that to you.
02:00:39.000 So I'll just write a few words and it'll write a long thing.
02:00:43.000 Or it might be like, dear Joseph or something.
02:00:46.000 It adds that filler stuff.
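[The email idea is easy to sketch concretely. A minimal illustration, assuming the OpenAI Python client (`openai` >= 1.0) with an API key in the environment; the model name and prompts are just placeholders.]

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def draft_email(instruction: str) -> str:
    """Expand a terse instruction into a full email, filler included."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "You write warm, polite emails with a greeting and sign-off."},
            {"role": "user",
             "content": f"Write a short email that does this: {instruction}"},
        ],
    )
    return response.choices[0].message.content

print(draft_email("Say something nice to Joe showing that I still care."))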
02:00:51.000 ChatGPT is really good at creating the filler that we all do.
02:00:54.000 That's why it can replace your English essay in high school, because most English essays are filler.
02:00:59.000 You're not actually saying anything interesting.
02:01:01.000 And on a date, too, most things are filler, except the human emotions that we feel, the dance of human emotions.
02:01:10.000 Maybe that's how we'll get to give up on being human, is that it becomes so muddied through things like ChatGPT 7.0 and AI that we're just like, who knows what the fuck it means to be a person anyway.
02:01:24.000 It's all muddied.
02:01:26.000 Maybe it'll help us discover the essence of what it means to be human, why we're special.
02:01:30.000 Maybe it's consciousness, the ability to deeply experience the thing.
02:01:33.000 That's what I love about you.
02:01:34.000 You're so optimistic.
02:01:36.000 You're always looking at the bright side.
02:01:38.000 You want it all to burn down.
02:01:39.000 I'm like, we're doomed.
02:01:40.000 We're doomed.
02:01:41.000 Say we're doomed.
02:01:42.000 Say it.
02:01:43.000 Say it!
02:01:43.000 That should be the title of your next special.
02:01:46.000 That's a little too on the nose.
02:01:48.000 There's something that...
02:01:50.000 This is the cynicism.
02:01:53.000 I don't know why people love to watch a thing burn down.
02:01:56.000 It's not that they like to watch the thing burn down.
02:01:58.000 They just want to be right that it's going to fail.
02:02:01.000 The fact that it's going to fail.
02:02:02.000 They want to be right.
02:02:03.000 But why don't you want to be right about a thing being awesome, which it usually is.
02:02:07.000 Because we like to find danger.
02:02:08.000 Oh, that's going to be a problem.
02:02:10.000 That's going to be a problem.
02:02:11.000 I mean, some people like to find the good and stuff, and some people like to say, this is going to turn out okay.
02:02:16.000 I know how this is going to work.
02:02:18.000 This is all going to work out right.
02:02:19.000 But if you were really paying attention, could you really be confidently stating that this is all going to work out?
02:02:27.000 Not confidently, but more likely than not, yes.
02:02:30.000 And the people that are actually building stuff.
02:02:32.000 So here's a dark reality of this public discourse we're operating in.
02:02:35.000 The people that say it's all gonna burn down, and you've had a few guests.
02:02:39.000 I'm not touching Russia, Ukraine today.
02:02:41.000 I don't think I actually talked to them on my trip to Ukraine.
02:02:44.000 It's interesting.
02:02:45.000 The people that are cynical and say that everything is burning down are somehow, just by that statement, seen as more intelligent.
02:02:53.000 I just observed this.
02:02:54.000 It's weird.
02:02:55.000 Yeah, it is weird.
02:02:56.000 The reality is the people that are building this stuff are usually optimistic.
02:03:01.000 Now, you could say they're too optimistic, but if you actually want to build a better world, you're usually going to be more optimistic.
02:03:07.000 The people that are considered intelligent are the ones that are going to be a little more cynical.
02:03:11.000 I think there's a balance there that's kind of nice because you need the critic.
02:03:15.000 It's not the critic that counts, but you need the critic in order for the people not to run off in the bad direction.
02:03:22.000 Well, that was the...
02:03:24.000 I mean, how many things, when they first were invented, were dismissed by smart people?
02:03:30.000 Like, the personal computer.
02:03:31.000 Like, when the personal computer was invented, everybody was like, what the fuck are you gonna do with that?
02:03:36.000 All the people that thought they were smart.
02:03:37.000 Dude, when podcasts first started, people were like, what are you doing?
02:03:41.000 And people that were, like, at the...
02:03:43.000 Howard Stern mocked them.
02:03:45.000 He's the top of the food chain when it comes to broadcasting.
02:03:48.000 They're like, what the fuck is this?
02:03:50.000 All these smart people, but they were wrong.
02:03:52.000 And I think that applies to so many things.
02:03:55.000 I think right now the sky is the limit and all bets are off when it comes to what AI and what technology is going to bring to humans.
02:04:04.000 And any ideas that we have that this will work out well or not well is just guessing.
02:04:09.000 But you're right that the people that like to think they're smart, they move towards, oh, we're fucked.
02:04:14.000 We're fucked, bro.
02:04:15.000 Yeah, that's more fun for whatever reason.
02:04:18.000 Yeah, it's weird.
02:04:18.000 It's weird.
02:04:20.000 So maybe once AI does all the actual work, we're going to descend into just talking shit nonstop.
02:04:27.000 Because we monkeys, descendants of apes, enjoy talking shit.
02:04:30.000 How far do you think we are away from Neuralink?
02:04:33.000 Well, the Neuralink is, I think we're far away.
02:04:36.000 Decades?
02:04:38.000 Well, no, Neuralink in humans.
02:04:40.000 Yeah.
02:04:40.000 Helping humans recover some capability.
02:04:43.000 We're like five years away.
02:04:44.000 If you ask Elon, it's probably like two years away.
02:04:47.000 But yeah, it's within a decade, there'll be a lot of incredible, like regained capabilities.
02:04:52.000 Regained sight, I think, is probably more than 10 years out, like being able to see what you could never see.
02:04:58.000 That's going to be amazing.
02:04:59.000 But in terms of expanded capability, it's gonna be a while, because we're gonna get so much amazing expanded capability in our devices that we just hold.
02:05:09.000 And the bandwidth is already pretty high in terms of communicating awesomeness to us.
02:05:13.000 So I don't see the obvious need for that extremely high bandwidth that Neuralink would provide, like just injecting AI into our brain.
02:05:25.000 I think we're probably like 50 years away from AI in our brain, basically being able to inject chat GPT knowledge into our brain directly.
02:05:38.000 So it's part of the thought process.
02:05:40.000 That's at least 50. Because here's the thing.
02:05:42.000 It's like that commentary from before: evolution has built a really complicated biological mechanism there.
02:05:51.000 It's really hard to understand how the brain works without understanding how it all comes from a single embryo.
02:05:56.000 There's this whole computation system that builds up a human being from a single strand of DNA. You can't just...
02:06:06.000 Monkey with that.
02:06:08.000 Yeah, you can't monkey with the result of it.
02:06:09.000 You can monkey with the development parts.
02:06:11.000 You have to understand the embryogenesis or whatever.
02:06:14.000 The process of building from the actual...
02:06:17.000 How the programming maps to the function throughout the entire process.
02:06:21.000 Because I think most of the magic honestly happens, first of all, probably in the womb and maybe in the first year of life.
02:06:27.000 That's where all the cool shit happens.
02:06:30.000 Messing with already the adult, the baked cake is too difficult.
02:06:35.000 So, of course, through simulation, like AlphaFold, a lot of stuff DeepMind is doing, through simulation we'll probably be able to understand some of these complicated biological processes like protein folding and more, but we're really far away from that.
02:06:47.000 I think we are really far away from it, but I don't know what that means because really far might just be a few years once a giant breakthrough happens.
02:06:53.000 But my point is I don't think they're mutually exclusive.
02:06:57.000 I think evolution and monkeying with the evolution is a part of evolution.
02:07:00.000 I think it's a natural course of progression for the way the human curious mind works and its ability to manipulate things around it.
02:07:07.000 Whether it's manipulate environments and structures to survive the elements, or whether it's manipulating electricity and frequencies to send signals and videos through the sky, whatever the fuck it's doing, it's trying to always do a better version of that.
02:07:22.000 And I think that that manipulating genetics is a part of evolution.
02:07:26.000 I think it's just a natural part of evolution.
02:07:28.000 We just think of it as something different, since we created it. If we create a thing and that thing changes biology...
02:07:35.000 What have we done?
02:07:36.000 We've played God and we've...
02:07:37.000 No, no, no.
02:07:38.000 It's a part of the thing.
02:07:39.000 It's like bees make beehives.
02:07:41.000 We make technology.
02:07:42.000 That's like part of what we're here to do.
02:07:45.000 And one of the reasons why we're so hyper-curious and also materialistic is that that is the best way to fuel technological innovation.
02:07:54.000 And that it's a natural thing.
02:07:56.000 And then if we start monkeying with our genetics, that's also a natural thing.
02:08:00.000 It's all built into the system.
02:08:02.000 The same reason why fucking bats pollinate things.
02:08:05.000 It's all built into the system.
02:08:07.000 It's just some monkeying is harder to do than others.
02:08:09.000 I think the biological one is tricky.
02:08:11.000 Even genetic engineering is tricky.
02:08:13.000 For now.
02:08:15.000 For now.
02:08:16.000 Yeah, but how long is it going to be tricky for?
02:08:18.000 I mean, back then when you were on that stupid wagon making your way across the country, ducking arrows, that was a stupid way to get to the other side of the country.
02:08:26.000 But now you just get in a plane.
02:08:28.000 And instead of taking months, and eating your kids in the fucking mountains because you're snowed in, you land in California in three hours.
02:08:37.000 It's crazy.
02:08:38.000 And complain about the Wi-Fi.
02:08:40.000 Yeah, you're bitching.
02:08:41.000 I can't even fucking watch a YouTube video up here.
02:08:47.000 Yeah, I mean, what we're doing now with that stuff is inconceivable to people that made their way across this country in the 1800s.
02:08:55.000 And I think what we're going to be able to do in the future, 200 years from now, is inconceivable to us.
02:09:02.000 Probably even more so.
02:09:04.000 It's probably – and I think we're probably going to be visited.
02:09:07.000 I think there's going to come a time with these things from other places that are leaving behind video and signals and evidence of something that exists in a way that we can't explain or describe.
02:09:21.000 But those things are probably going to make themselves more well-known.
02:09:25.000 Well, that's why space exploration is really interesting to me.
02:09:28.000 It feels like it's going to increase the likelihood.
02:09:30.000 I really...
02:09:31.000 A dream for me in my lifetime is to be there if they discover life on another planet.
02:09:39.000 Like actual definitive evidence.
02:09:41.000 It can be bacteria.
02:09:42.000 It doesn't matter.
02:09:42.000 Because that shows to you...
02:09:44.000 Whether it's on Mars, it could be dead.
02:09:47.000 It could be on the moons of Jupiter and Saturn, Titan, and all that.
02:09:52.000 If they discover life, that definitely didn't come from Earth.
02:09:57.000 Like, that means life is everywhere.
02:10:00.000 Like, bacteria, doesn't matter.
02:10:03.000 Life is everywhere.
02:10:04.000 Something is everywhere.
02:10:05.000 And then I say, fuck it, there's no way intelligent alien civilizations are not everywhere.
02:10:12.000 And then you have terrifying questions like, why the hell are they not showing up?
02:10:16.000 But like, they for sure are everywhere.
02:10:18.000 I think it's just a distance issue.
02:10:21.000 When you say, why are they not showing up?
02:10:25.000 If they are coming here, why would they let us know?
02:10:28.000 If we're trying to look at them and try to figure out where they are, it's a distance issue.
02:10:33.000 There's no way we can figure it out yet.
02:10:35.000 But if they solved, like Kardashev type 1, type 2, like if they solved energy, like nuclear fusion at a scale of like a star system or a scale of a galaxy, we should be able to see them.
02:10:45.000 It should be radiating.
02:10:46.000 Like there should be some weird shit.
02:10:48.000 There could be some sort of a parallel technology that's incomprehensible to us.
02:10:53.000 It's undetectable because we don't even know what to look for.
02:10:56.000 Yeah, well, we're like learning a lot about black holes.
02:10:59.000 That's a weird thing.
02:11:00.000 Like, what the hell is a black hole?
02:11:01.000 That's really weird.
02:11:02.000 Very weird.
02:11:03.000 Everyone's worried about woke people and so on.
02:11:05.000 I'm worried about black holes.
02:11:07.000 Some of them are just rogue.
02:11:08.000 They're just moving across galaxies, devouring everything in its path.
02:11:12.000 Yeah, and they're somehow either destroying information or some stuff.
02:11:15.000 It's a singularity.
02:11:17.000 And then you can probably use it because they're messing with gravity.
02:11:20.000 You can probably use it for transportation somehow.
02:11:22.000 And we need to figure that out.
02:11:24.000 How about the first dude who has the ball, the Chuck Yeager of a black hole?
02:11:29.000 Take a ride.
02:11:31.000 Those guys, man, think about those first jet fighter pilots and first astronauts and people who had the balls to climb into a seat of a rocket, get shot up into the fucking cosmos.
02:11:44.000 Yuri Gagarin.
02:11:45.000 Yikes!
02:11:45.000 So the interesting thing about the Soviet side compared to the American side: Yuri Gagarin was the first man in space, and I think the safety standards were a little lower on the Soviet side.
02:11:57.000 I would imagine!
02:11:59.000 So, like, it's not...
02:12:00.000 The production standards were low, too.
02:12:02.000 You ever see the video that they tried to pretend was him actually doing it?
02:12:06.000 No.
02:12:06.000 Well, he most certainly did it.
02:12:08.000 Don't get me wrong.
02:12:09.000 But they most certainly recreated the video of the footage.
02:12:12.000 Because, like, how are they getting the fucking cameras in there with them and all the lighting?
02:12:15.000 There's different shadows behind him.
02:12:17.000 It's so unsophisticated.
02:12:20.000 Well, funny thing, people criticize the U.S. over the moon landing and so on, being suspicious, but the U.S. is actually much better at filming stuff.
02:12:27.000 They did a better job of just strapping an astronaut in and just launching the thing.
02:12:33.000 No plan for landing.
02:12:37.000 It's extremely dangerous what they did.
02:12:39.000 Just to beat the US by a few months.
02:12:42.000 They had Kubrick film it.
02:12:44.000 For the moon landing.
02:12:45.000 Yeah.
02:12:45.000 America had Kubrick film it.
02:12:48.000 He did an amazing job.
02:12:50.000 It looks so realistic.
02:12:52.000 The guy that did 2001. Look at it.
02:12:54.000 Look at what he did.
02:12:54.000 Amazing.
02:12:55.000 Great work.
02:12:56.000 And now we're coming back there.
02:12:57.000 Kidding.
02:12:58.000 Yeah, I hope we do go back.
02:12:59.000 It would be wild if we went back there and we did find the lunar lander and all the footprints and shit.
02:13:06.000 If something from 1969, if you really did find footprints that were undisturbed, that would be so strange.
02:13:13.000 So weird.
02:13:14.000 Like, could you imagine if you could go, like, if there was a famous explorer, you know, that went to some weird island somewhere, and you go there, and you see his footprints still in the mud, where he walked in the 1960s, you'd be like, holy shit.
02:13:29.000 But imagine that times a million, if you go to the moon and see footprints.
02:13:33.000 But imagine if you go up there and there's fucking...
02:13:36.000 Yeah, it doesn't matter, but you still got up there.
02:13:38.000 You're like, wait, they were supposed to land right here?
02:13:40.000 This is bullshit.
02:13:42.000 Would you tell anybody?
02:13:43.000 If you found out it was horseshit, would you open your mouth or would you just keep it to yourself?
02:13:47.000 Well, because the person that would likely be in charge of the effort is Mr. Elon Musk, for sure.
02:13:55.000 He would tell everybody.
02:13:56.000 He would tell everybody.
02:13:57.000 He would tell everybody.
02:13:57.000 If he gets up to the moon...
02:13:58.000 Is that meme?
02:13:59.000 Like, what are you saying?
02:14:00.000 We never land on the moon?
02:14:01.000 We never did.
02:14:02.000 And then you get shot.
02:14:04.000 That meme.
02:14:04.000 Yeah, that meme.
02:14:06.000 If Elon goes up there and finds out we never landed there, that would be fucking wild.
02:14:12.000 Yeah, the moon files as opposed to the Twitter files.
02:14:17.000 It would be more shocking if we went at this point.
02:14:20.000 It would be more shocking if we really did land on the moon in 1969. That's how I feel.
02:14:24.000 Really?
02:14:25.000 No.
02:14:27.000 By the way, if I could just give a shout out.
02:14:30.000 I'm half joking.
02:14:32.000 Everybody needs to check out Tim Dodd, Everyday Astronaut.
02:14:36.000 He has an incredible YouTube channel.
02:14:37.000 He's talked to Elon a couple of times, but I got to meet him and interact with him.
02:14:43.000 That man knows.
02:14:44.000 I just love people that are passionate about a particular topic to a level of like obsession.
02:14:51.000 He loves rocket propulsion.
02:14:53.000 He doesn't even like space travel.
02:14:54.000 He likes space travel.
02:14:56.000 He just likes to watch things burn and fly.
02:14:58.000 Yeah, so he's a car guy.
02:14:59.000 There he is.
02:15:00.000 It's very technical videos.
02:15:02.000 They're kind of, he's a great educator.
02:15:04.000 What does he do?
02:15:05.000 What is his job?
02:15:06.000 Educate, teach about rockets, rocket propulsion, and not like...
02:15:11.000 He's a YouTuber.
02:15:12.000 But see, like with YouTubers sometimes you can think like, okay, this person is a shallow level educator.
02:15:17.000 This person is like...
02:15:20.000 Why would anybody think that?
02:15:21.000 You could be brilliant on YouTube.
02:15:23.000 No, I know.
02:15:23.000 YouTube is an amazing resource.
02:15:25.000 And I know people criticize it, but...
02:15:28.000 It's incredible.
02:15:28.000 Because of the censorship, but...
02:15:31.000 Talk about something that's changed the way people have access to information.
02:15:34.000 YouTube might be the biggest because you could find out how to fix anything almost instantaneously.
02:15:39.000 Find out information on stuff, even wrong information.
02:15:42.000 You know how many flat earth videos there are on YouTube?
02:15:45.000 You want to find out about flat earthers?
02:15:47.000 You can dig around, son.
02:15:49.000 You can find some compelling arguments.
02:15:53.000 You're like, what the fuck?
02:15:55.000 Which is what happens when you get to say something and nobody gets to refute you on the spot.
02:15:59.000 Yeah, well, if you believe in gravity, one of the...
02:16:04.000 Tim is definitely somebody to look into, because he's a car guy, and to me there's nothing fucking more badass than a rocket engine.
02:16:14.000 Pretty badass.
02:16:15.000 The bigger they get, it's like the roar, the fire, the explosions...
02:16:21.000 I mean, it's like the coolest basic...
02:16:26.000 Large-scale, badass engineering you could possibly achieve.
02:16:30.000 It's saying, fuck you to gravity, and just launching a giant thing that kind of looks like a dick into space.
02:16:38.000 It's incredible.
02:16:39.000 And sometimes with humans up on top.
02:16:42.000 Are you following Starship?
02:16:43.000 You know Starship, the SpaceX Starship?
02:16:45.000 Yes.
02:16:45.000 That's like the big-ass ship that they're testing.
02:16:48.000 I think they're doing this week a static fire test for the first time.
02:16:53.000 So it has these 33 giant Raptor engines, which each one individually I just want to take home, if I'm being honest.
02:17:02.000 Will he let you keep a piece?
02:17:04.000 No, you don't want a piece.
02:17:05.000 You want the whole thing.
02:17:06.000 Why would you want a piece?
02:17:07.000 Just a little part.
02:17:08.000 No, I want it to fucking burn.
02:17:10.000 I don't want it to, like, sit there like, oh, this is rockets from the moon.
02:17:14.000 Just the power, like, the thrust of the fucking thing that can lift a building off the ground out into space is incredible.
02:17:21.000 So they're doing for the first time all 33 engines.
02:17:24.000 Just a static fire means you're testing it on the ground.
02:17:27.000 Just full burn.
02:17:29.000 All just to see what happens.
02:17:31.000 See if it doesn't blow up.
02:17:32.000 Just the most powerful rocket ever built.
02:17:34.000 Holy shit.
02:17:35.000 And they're going to be launching humans on top of that rocket very soon.
02:17:39.000 Tim Dodd, like an idiot, he signed up for a program.
02:17:43.000 Oh, Tim, we need you back on Earth.
02:17:46.000 Oh, Tim.
02:17:47.000 Oh, Steve Aoki.
02:17:48.000 We talked about this before.
02:17:50.000 Steve Aoki.
02:17:50.000 I'm going to call him up.
02:17:52.000 Hey, bro.
02:17:53.000 Cut this shit.
02:17:54.000 The guy in the middle is funding it.
02:17:55.000 He's got a bunch of artists, a bunch of creative minds.
02:17:58.000 The guy in the middle is going to be behind a fucking giant cement wall laughing his dick off.
02:18:02.000 No, he's coming along.
02:18:03.000 He's not going.
02:18:04.000 He's going.
02:18:05.000 He's going to get COVID right before it's supposed to...
02:18:07.000 You guys go.
02:18:09.000 I'll stay.
02:18:09.000 He's like, I'm really sorry that I have to sacrifice.
02:18:12.000 People are going to get sick.
02:18:14.000 Half of those people are going to get stubbed toes and shit.
02:18:16.000 I need knee surgery.
02:18:18.000 They're not going to go.
02:18:19.000 This is the comedian Joe Rogan saying, burn it all down.
02:18:22.000 It's called Dear Moon.
02:18:25.000 I mean, so they're going around the moon.
02:18:27.000 Right.
02:18:27.000 But it's humans on top of this giant...
02:18:29.000 And they're using the SpaceX rocket ship.
02:18:30.000 SpaceX starship, yeah.
02:18:31.000 And when is that supposed to take place?
02:18:33.000 So there's a few steps along the way.
02:18:35.000 It's supposed to be this year, but there's a lot.
02:18:38.000 Starship is such a difficult rocket to build.
02:18:40.000 Because Starship is designed to land on Mars and take off.
02:18:46.000 So like full trip back.
02:18:49.000 Two-way trip to Mars and back.
02:18:51.000 Look what it looks like, man.
02:18:52.000 It's insane.
02:18:55.000 Goddamn, that looks cool as fuck.
02:18:56.000 That's what it looks like?
02:18:57.000 That's the actual thing?
02:18:58.000 Yeah.
02:18:59.000 Fuck.
02:19:01.000 That's the top part and then the bottom part to the left.
02:19:03.000 Is that the part that people are in?
02:19:05.000 Yeah.
02:19:06.000 Yeah, that's what I'm saying.
02:19:07.000 So that thing alone, look how dope that thing looks.
02:19:11.000 Go back up where you see it circling over the moon.
02:19:13.000 How much are people going to have to pay for that?
02:19:15.000 Because you want to talk about a life-changing experience flying over the fucking moon in a spaceship.
02:19:22.000 How long is this trip?
02:19:23.000 A couple of weeks?
02:19:24.000 Yeah, probably.
02:19:26.000 But I think the most magical thing will probably be looking back at Earth.
02:19:30.000 Yeah.
02:19:31.000 Well, all of it would be wild.
02:19:32.000 You know, people won't want to go to the Maldives anymore.
02:19:35.000 They're going to want to fly over the moon.
02:19:37.000 Motherfucker.
02:19:38.000 Fly over the moon.
02:19:39.000 Yeah.
02:19:39.000 You haven't flown over the moon?
02:19:41.000 Oh my god, bro, it changed me.
02:19:43.000 It would be annoying just when people talk to you about their psychedelic trips.
02:19:48.000 Would you go?
02:19:49.000 No!
02:19:50.000 Not for years!
02:19:51.000 Because you know one of them ain't making it home.
02:19:53.000 Yeah, it'll probably be yours.
02:19:55.000 When is it going to happen?
02:19:56.000 Is it going to happen, you know, on the third one?
02:19:59.000 Yeah.
02:20:00.000 On the tenth one?
02:20:01.000 One of them's going to get hit with a little micrometeorite.
02:20:03.000 Ping!
02:20:04.000 I'll read a poem at your funeral.
02:20:05.000 Oh, sweetie.
02:20:06.000 Yeah.
02:20:07.000 Thank you.
02:20:08.000 He was like a little tear.
02:20:11.000 Played Dust in the Wind.
02:20:13.000 I think it looks like five days is all it is.
02:20:16.000 What?
02:20:17.000 Of course.
02:20:18.000 New rockets.
02:20:19.000 They're faster.
02:20:20.000 Five and a half, six days.
02:20:21.000 Jesus Christ.
02:20:22.000 It's an extended trip.
02:20:24.000 Get there on the fifth day.
02:20:25.000 It's like a trip to Europe.
02:20:26.000 They're not doing any connection to the space station.
02:20:29.000 They're just flying.
02:20:31.000 Unless they have to.
02:20:33.000 Listen, bro.
02:20:34.000 Fuck all that.
02:20:36.000 Fuck all that.
02:20:39.000 You're gonna go?
02:20:39.000 I need you here, bro.
02:20:41.000 We have weekends off.
02:20:42.000 Come on, you're gonna die!
02:20:44.000 I'll film it.
02:20:44.000 What if you die?
02:20:45.000 I'll play Dust in the Wind at your funeral, too.
02:20:47.000 Yeah, what if you die?
02:20:48.000 There's something to talk about.
02:20:50.000 Yeah.
02:20:51.000 Something to talk about.
02:20:52.000 Take a video so we can show everyone.
02:20:54.000 I'm gonna have to be like Howard Stern with a fucking switch here, going back and forth.
02:20:58.000 There'll be an AI to replace me.
02:21:02.000 My time's running low.
02:21:03.000 Oh, Jamie.
02:21:05.000 Oh, Jamie.
02:21:07.000 No, you're special.
02:21:08.000 Takeover.
02:21:09.000 Yeah, how long do you think before we are a multi-planetary species?
02:21:13.000 100 years?
02:21:14.000 Yeah, 100. Well, see, the really dark...
02:21:19.000 The dark reality with everything that SpaceX is doing that I really worry about, versus Tesla and everything else that Elon is involved with, is that if Elon is no longer here, I don't know if we'll be pushing towards that as hard as we are.
02:21:35.000 Yeah, we gotta protect Elon at all costs.
02:21:37.000 He's so singular in this, what a lot of people are calling an insane drive, to go to Mars and actually colonize Mars and become a multi-planetary species. There are just so few people that are really pushing for that, like obsessively pushing for that.
02:21:52.000 That's why, you know, with Tesla, with automation and electrification of vehicles, there's other people trying this and working on this.
02:21:58.000 They're being quite successful.
02:22:00.000 Even with brain-computer interfaces, with Neuralink, everything, there's a lot of amazing development.
02:22:06.000 But Mars, I worry about how singular is Elon Musk in this world.
02:22:15.000 What would happen?
02:22:16.000 What are the options if he wasn't here?
02:22:18.000 Would it stop dead in its tracks?
02:22:20.000 That was the thought about the initial Apollo missions, is that those people aren't there anymore, right?
02:22:27.000 People have always had this question, why haven't we gone back to the moon?
02:22:31.000 There's conspiracy theories that we never went in the first place, and then there's also people that say, no, those people that had the singular obsession to beat Russia, they don't exist anymore.
02:22:40.000 They're not there anymore.
02:22:41.000 The Cold War doesn't exist anymore.
02:22:43.000 They burned up a lot of money doing this, going back and forth, and then they just stopped.
02:22:47.000 And those people that got there, they're not around anymore.
02:22:50.000 You'd have to literally relearn everything.
02:22:52.000 Yeah.
02:22:53.000 Well, on the Cold War front, luckily, China's...
02:22:59.000 I can imagine a positive competition, a friendly competition between U.S. and China on the space front.
02:23:04.000 Again, you with the fucking optimism.
02:23:06.000 I have to pee so bad.
02:23:08.000 Can we hold this thought?
02:23:09.000 Sure.
02:23:10.000 Hold this thought.
02:23:10.000 We'll go right back.
02:23:11.000 Optimism, China.
02:23:13.000 Don't forget.
02:23:14.000 China, healthy competition.
02:23:16.000 Yeah.
02:23:17.000 In the space of...
02:23:19.000 In the space, no pun intended.
02:23:21.000 In...
02:23:23.000 In the efforts of space exploration, space travel, launching rockets up into space, that seems like one of the only situations in which major nations that are competing otherwise can collaborate in a healthy competition.
02:23:36.000 Because, at least for now, there's no military conflict out in space.
02:23:40.000 And so you can, there's a legitimate scientific engineering competition that's happening.
02:23:45.000 And that's happened with the Soviet Union.
02:23:47.000 The space race was, there's a cold war going on, but the space race was between engineers and scientists and so on.
02:23:54.000 And a huge investment into that effort, but it was peaceful.
02:23:58.000 Yeah, it's an interesting time if you think about it, right?
02:24:00.000 Like 1969, when there was this battle for technological superiority that was, in a lot of ways, fueled by Nazi scientists.
02:24:13.000 Yeah.
02:24:13.000 Which is really crazy.
02:24:15.000 Operation Paperclip, where they brought Nazi...
02:24:18.000 Wernher von Braun, all these guys with the dueling scars on their faces.
02:24:23.000 The whole history of it is so fascinating.
02:24:25.000 It's dark.
02:24:26.000 He's considered to be, I guess, I mean, probably it's fair to say like the father of space travel.
02:24:30.000 Yeah.
02:24:34.000 Who cares where the rockets go up?
02:24:37.000 No, wait.
02:24:38.000 If the rockets go up, who cares if they come down, says Wernher von Braun.
02:24:42.000 Where's that from?
02:24:43.000 Eric Weinstein told me that.
02:24:45.000 There's a song.
02:24:46.000 Oh, a song?
02:24:47.000 When you think about alternative methods of propulsion, how far away do you think we are from something that's far superior to these badass rockets that you love so much?
02:25:00.000 Yeah, because it's like old-school technology currently.
02:25:02.000 Yeah.
02:25:03.000 Well, if we think about what these aliens are supposedly doing, supposed aliens, UAPs, I mean, for all we know, they could be drones.
02:25:10.000 I mean, it makes more sense.
02:25:11.000 Why would you go there physically as a biological entity and risk death?
02:25:15.000 Just think about the capabilities of what we send to Mars.
02:25:18.000 Those drones they send to Mars, it's incredible.
02:25:21.000 The images that we get back, it's amazing.
02:25:23.000 And it's fairly rudimentary compared to what we imagine these aliens, supposedly thousands of years advanced from us, millions of years advanced from us, would have.
02:25:35.000 Well, I think there's a lot of kind of short-term, meaning in the next 50 years, development that could happen with nuclear propulsion, especially out in space.
02:25:46.000 So taking off from Earth, the downside of nuclear propulsion is the radiation.
02:25:50.000 But out in space, you can do propulsion with nuclear fission or nuclear fusion.
02:25:55.000 For longer term space travel to really accelerate a lot and to have a lot of energy for the long distance, like interstellar travel.
02:26:05.000 But even that, from everybody that I talk to, that's not enough.
02:26:08.000 So I think if we want to get humans, if you want a super light vehicle that just travels super fast, that's different.
02:26:16.000 But that's probably not what we're interested in.
02:26:19.000 That's very interesting from a scientific perspective, like travel to Alpha Centauri, super, super fast, like, I don't know, a fraction of the speed of light, and then take a few pictures, like fast flybys.
02:26:30.000 That's interesting.
02:26:32.000 Imagine if they fly over in a drone, they get pictures of cities.
02:26:36.000 Yeah.
02:26:37.000 Do you imagine the first time we send some sort of an interplanetary probe, we send something that can go to another solar system, and we just fucking hit jackpot.
02:26:45.000 We fly over some Blade Runner city, you know, like, holy shit!
02:26:50.000 The problem is, and this is actually the sad thing, like with the Fauci thing, the thing I worry about is that with the cynicism and the controversy and the politicization of science, people will doubt whatever we see.
02:27:02.000 Whatever we see.
02:27:03.000 There's almost nothing we could see that there wouldn't be narratives around: this is a controversy, this is fake.
02:27:08.000 Imagine how many people would say it's fake.
02:27:09.000 Yeah, some people definitely would.
02:27:11.000 Because we can create incredibly fake images.
02:27:14.000 How do you know it's real?
02:27:15.000 It's very difficult to...
02:27:16.000 Even if you have literally a body of an alien...
02:27:20.000 Yeah, but even then, you know, people say, it's a demon.
02:27:23.000 Yeah.
02:27:23.000 That's a demon brought here from Satan to test our will.
02:27:26.000 So I think a lot of that requires us to kind of solve some of the transparency and trust issues we have.
02:27:32.000 Well, there's certain things that we agree on, right?
02:27:34.000 Like everybody agrees that the sun is hot.
02:27:36.000 It's in the sky.
02:27:37.000 It's a fireball.
02:27:38.000 For now.
02:27:39.000 For now, wait until nuclear fusion, which is what powers the sun, becomes a legitimate power source that competes with our current power sources, and people will be like, well, no, they'll construct all kinds of narratives around the sun.
02:27:52.000 Just people that don't think nuclear bombs are real.
02:27:54.000 It's like a growing movement of knuckleheads online who think nuclear bombs are fake.
02:27:58.000 And that's probably a subset, but there's a large number of people that believe nuclear energy is unsafe.
02:28:04.000 Yes.
02:28:05.000 And all the data shows it's safer than...
02:28:08.000 Everything else.
02:28:09.000 Everything else.
02:28:10.000 Yeah.
02:28:10.000 But there's something terrifying about nuclear that people are scared of.
02:28:13.000 Well, it's the old nuclear where it fucked up and ruined entire towns.
02:28:17.000 You know, like made them radioactive forever.
02:28:20.000 Yeah.
02:28:20.000 Yeah.
02:28:21.000 But you have to look at all the other dangers.
02:28:24.000 But then again, the people that are telling you that nuclear is safe are also the same people that are telling you that other things we put in our bodies are safe.
02:28:34.000 And there's a big distrust of that kind of...
02:28:36.000 To me, this is the biggest tragedy that there's a lot of people that are good at what they fucking do in this world.
02:28:45.000 And for us to constantly be suspicious of them is just not a good way to progress in this civilization.
02:28:51.000 Sure, like people that are suspicious of Bill Gates when he's promoting health advice and he doesn't look healthy.
02:28:58.000 Still might be doing a good job with health advice.
02:29:01.000 You mean like Louis C.K. on your show?
02:29:02.000 But he's unhealthy.
02:29:03.000 What did he say?
02:29:05.000 I think he was talking about how it's a great luxury to be talking about ice baths and whatever.
02:29:14.000 How ridiculous is it that we've arrived in human civilization to a place where we're debating the benefits of an ice bath?
02:29:21.000 Which I think is a very...
02:29:22.000 It's actually kind of an elitist thing to say.
02:29:25.000 I mean, I agree with him a little bit, but I have an optimistic take.
02:29:30.000 It's incredible that we arrived at this place.
02:29:32.000 That's awesome.
02:29:33.000 That we get to concern ourselves with health.
02:29:35.000 But by the way, ice baths, coming from just the Soviet Union, that's not the epitome of elitist, expensive health care.
02:29:45.000 No, you guys just cut a fucking hole in the lake.
02:29:47.000 Exactly.
02:29:48.000 There's cold showers and...
02:29:51.000 In the Soviet Union and Mongolia, I mean, that's a part, like, everyone understands the benefit of cold.
02:29:56.000 It's not like...
02:29:56.000 You saw that video of Fedor in the banya when he was training back when he was in Pride?
02:30:01.000 No.
02:30:02.000 They used to have, they had an outdoor sauna that was right next to a frozen lake.
02:30:07.000 And they would go back and forth.
02:30:08.000 Heat and cold requires no money.
02:30:10.000 Yeah.
02:30:11.000 Or very little.
02:30:11.000 Well, the sauna, you know, you've got to construct it, but you can construct a wood-fired sauna and you can do it yourself.
02:30:19.000 Just the same way you can build a shed, you can build a sauna.
02:30:21.000 Like, Cowboy Cerrone built a sick one.
02:30:24.000 He built a huge one by himself for his ranch, and it works on firewood.
02:30:30.000 A lot of people like saunas that work on firewood because it makes you feel like a fucking savage.
02:30:36.000 You're out there with an actual fire and you're sweating it out.
02:30:39.000 There's an added element of the smell of burning hardwood too that's very exciting for people.
02:30:46.000 So it's not just like you turn on a button and the rocks get hot because there's an electric coil that heats up.
02:30:51.000 This is way better because you've got an actual fire burning.
02:30:55.000 It brings you back to a campfire feeling.
02:30:58.000 That said, it's a little bit bro-y to criticize a soft body as not being able to generate a lot of value in this world.
02:31:06.000 Like Bill Gates' body, like basically every body of a scientist or engineer or leader over the age of whatever.
02:31:13.000 I mean, they're just focused.
02:31:15.000 They're busy.
02:31:16.000 Sure.
02:31:16.000 It's definitely, they could probably perform better.
02:31:18.000 I mean, this is the criticism.
02:31:19.000 Like, a leader, like Elon could perform better if he sleeps more, if he exercises more.
02:31:24.000 Yes.
02:31:25.000 Right, but he's so obsessed, he doesn't seem to care.
02:31:27.000 Look, he looks great.
02:31:29.000 How much do you think he can bench deadlift?
02:31:31.000 Four pounds.
02:31:32.000 How old is he?
02:31:35.000 The pressure that this guy must be under with all that scrutiny, that's like an undeniable impact on your health.
02:31:41.000 So many people are like mad at that dude.
02:31:44.000 Wait, he's only 67?
02:31:46.000 Holy shit.
02:31:48.000 Go back to that image again.
02:31:49.000 Haters gonna hate, Joe.
02:31:51.000 Yeah, but dude, that doesn't, that's not, um, that's not optimal health.
02:31:57.000 Someone should talk to him about what he's eating.
02:31:59.000 There's also optimal mental health and intellectual diversity and growing as a person.
02:32:03.000 Did you see that lady on 60 Minutes?
02:32:07.000 They interviewed her.
02:32:08.000 She's some new woman who works in the White House.
02:32:12.000 And they asked her about obesity.
02:32:14.000 She said the number one cause of obesity is genetics.
02:32:17.000 And it doesn't matter what you do, like, you could be a person who has a perfect diet and exercises and sleeps right and you're still obese.
02:32:25.000 And the health experts went fucking nuts.
02:32:27.000 Like, that's not what the data shows.
02:32:28.000 The data shows that most people who are obese have obese parents and they come from an obese family, but they're all doing the wrong thing.
02:32:38.000 It's not, there's not like...
02:32:41.000 A person in that family that's eating grass-fed steak and running marathons and lifting weights and getting up at 6 in the morning and getting in a cold plunge and doing all these different things, but is still fat as fuck.
02:32:51.000 And they're watching their calories in and calories out, and they're burning 1,000 calories a day in exercise, and they're still fat as fuck.
02:32:58.000 That's not real.
02:32:59.000 To say that, and to say it on 60 Minutes, there's this weird thing going on where people want to say, it's not your fault.
02:33:08.000 And it isn't your fault.
02:33:09.000 I mean, if you believe in determinism, if you believe in the impact of the people around you and the environment that you're in, which is most certainly real.
02:33:17.000 The impact of your parents, the impact of modeling.
02:33:20.000 You're modeling after other people's bad decision making.
02:33:23.000 That's all real.
02:33:24.000 That's 100% real.
02:33:25.000 But to say that all obesity is just genetic is bonkers.
02:33:31.000 That's a bonkers thing to say, and it discredits all these people that we know who were obese and who, without surgery, lost all that weight and looked great.
02:33:39.000 Like Ethan Suplee.
02:33:41.000 Perfect example.
02:33:42.000 There was a guy that was at one point in time like 500 plus pounds, right?
02:33:47.000 How big was Ethan when he was at his biggest?
02:33:50.000 But anyway, Jamie will find out.
02:33:53.000 Documented all of it.
02:33:55.000 Did it publicly because he was a fucking star.
02:33:57.000 He's a famous actor.
02:33:59.000 Lost all the weight and now looks great.
02:34:00.000 And did it through exercise and discipline and even was really open about the fact that he gained a lot of it back a couple times.
02:34:08.000 He went from 550 to 255. He did that.
02:34:12.000 He did that himself.
02:34:13.000 I mean, he did it, and he documented it, and he had to go through surgery to get the skin removed so that he wasn't like a flying squirrel.
02:34:23.000 But he did it.
02:34:24.000 And to say that it's all genetic.
02:34:28.000 Like, no, he had the same genes.
02:34:30.000 Like, this is the same guy.
02:34:31.000 It's not.
02:34:32.000 And it's also not inspiring.
02:34:34.000 Yes.
02:34:35.000 But that's the tension.
02:34:37.000 If you say it's all genetic or it's significantly genetic, then you're encouraging people to be more accepting of the challenges of other people's lives.
02:34:47.000 Everybody's walking a hard road is basically the philosophical thing.
02:34:52.000 Just because it's easy for you to exercise doesn't mean it's easy for others to exercise.
02:34:57.000 Sort of.
02:34:57.000 But aren't they also saying you don't even have to walk that road because it's not going to help you?
02:35:01.000 Exactly.
02:35:01.000 So that's a very poor statement of that.
02:35:03.000 It's a trade-off.
02:35:04.000 I mean, it's a different philosophy.
02:35:06.000 Pull yourself up by your bootstraps is a really inspiring, powerful, empowering philosophy.
02:35:11.000 But it's like, yeah, well, sometimes it's harder.
02:35:14.000 You can't.
02:35:15.000 You can't say that because different people have different...
02:35:18.000 Some people don't even have fucking shoes.
02:35:19.000 The idea of pull yourself up by your bootstraps is stupid.
02:35:22.000 And the, I did it, why didn't you?
02:35:24.000 No, you did it with your life.
02:35:25.000 The idea that your life, because it was difficult, is exactly the same as somebody else's life, which may be more difficult or have insurmountable obstacles that are in the way.
02:35:34.000 There's also different temperaments, different mental fortitude that people are just, for whatever reason, from the womb have.
02:35:44.000 Some people are just determined from the time they're really young, and some people are just not.
02:35:49.000 Some people are discouraged easily and some people are not.
02:35:52.000 And I don't know why.
02:35:53.000 But to say that there's no way is crazy.
02:35:57.000 To say there's no way is like that's irresponsible.
02:35:59.000 And it's also like to say that and just put it on 60 Minutes.
02:36:03.000 Hey guys, that's not true.
02:36:05.000 And you could talk to a lot of people that have lost weight and they'll tell you it's not true.
02:36:08.000 It doesn't mean that the people who are obese didn't get a really bad hand genetically, a really bad hand in terms of the environment they grew up with.
02:36:16.000 Yeah, they got dealt a bad hand.
02:36:18.000 No doubt.
02:36:19.000 It's not the same as someone who grows up in a house where everybody's skinny and the fucking whole family runs.
02:36:23.000 Like, no, it's not going to be the same.
02:36:25.000 Someone who's eating organic and the whole family does a lot of exercise and does stuff together.
02:36:31.000 Yeah, they're going to be thinner.
02:36:32.000 Yeah.
02:36:33.000 But you can't lie.
02:36:35.000 You can't lie.
02:36:37.000 And you can't be a fucking...
02:36:38.000 You can't expect me to think that you're really an expert when you say things like that.
02:36:42.000 Yeah, but you also can't criticize Bill Gates by saying he has a soft body.
02:36:47.000 Of course you can.
02:36:47.000 Of course, you're a comedian, but...
02:36:48.000 But he does.
02:36:49.000 And if he's talking about health, hey, buddy, get your house in order first.
02:36:53.000 Yeah.
02:36:54.000 Yeah, definitely.
02:36:55.000 There's a lot of incredible doctors that don't have their house in order.
02:36:58.000 That's true.
02:36:59.000 But if you're giving health advice, one of the core components to health is your metabolic health.
02:37:04.000 Your overall metabolic health.
02:37:05.000 I'll put up this story.
02:37:06.000 Right out of the gate, they're talking about using that drug, that semaglutide.
02:37:10.000 Oh, no.
02:37:11.000 Is that what they're doing?
02:37:12.000 So this is like an ad for semaglutide?
02:37:14.000 I'm not saying that, but that's what it seems like.
02:37:16.000 Oh my god.
02:37:17.000 Because there's even the whole thing about the shortages of it.
02:37:19.000 Oh no.
02:37:20.000 You know, I think Huberman was discussing this.
02:37:24.000 He might have been discussing this with Peter Attia.
02:37:26.000 And they were discussing that semaglutide doesn't just make you lose fat, but also makes you lose muscle in many cases.
02:37:36.000 And that's probably because you're not taking in enough food.
02:37:40.000 Right?
02:37:41.000 Because what it's doing is, I'm just guessing.
02:37:43.000 There might be some other mechanisms involved, obviously.
02:37:45.000 But if you're, like, full quicker, which is the idea behind this stuff, it's almost like you're taking an injection that does the same thing as, like, a belly band.
02:37:55.000 I mean, if you're not eating enough food and you're losing fat that quickly, you might be losing muscle, too.
02:38:01.000 Because when people go on binge diets and they starve themselves, they lose muscle.
02:38:05.000 When guys lose weight for fights and they get down to a very minimal calorie input, they lose muscle, too.
02:38:14.000 Like when someone cuts themselves down from like 205 and fights at 170, 100% they're going to lose some muscle too.
02:38:21.000 Yeah, but there's an interesting, so fighting is different, but if you're doing it in a healthy way for your own personal life, there might be some strength training combined.
02:38:31.000 I mean, that's a really interesting dynamic, right?
02:38:33.000 How do you lose weight while maintaining muscle mass?
02:38:35.000 Well, it depends on how many calories you're burning.
02:38:39.000 Much of the weight loss resulting from GLP-1 agonists is the loss of muscle, bone mass, and other lean tissue rather than body fat.
02:38:49.000 Holy shit.
02:38:50.000 It's dark.
02:38:50.000 For example, a 2021, but at least you look good.
02:38:54.000 2021 trial entitled The Impact of Semaglutide on Body Composition in Adults with Overweight or Obesity that included pre- and post-treatment DEXA scans.
02:39:04.000 DEXA is a medical imaging test used to assess body composition and bone density.
02:39:09.000 It's one of the most accurate methods for identifying how much body fat a person has versus fat-free mass, such as muscle and bone mass.
02:39:18.000 34.8% of the total weight loss experienced by participants receiving semaglutide resulted from muscle, bone, and connective tissue.
02:39:29.000 Oh shit, that's not good.
02:39:30.000 So that'll make your ligaments weaker?
02:39:32.000 Your fucking knees weaker and shit?
02:39:34.000 That's connective tissue.
02:39:35.000 I wonder if those people were working out at all, do you know what I mean?
02:39:36.000 They could've just been sitting around thinking they didn't have to do anything and now they lost a bunch of fucking muscle mass.
02:39:41.000 Good point.
02:39:41.000 Yeah, but 35%.
02:39:43.000 That's a lot.
02:39:44.000 Muscle, bone, and connective tissue.
02:39:46.000 That's not good.
02:39:47.000 That's not good.
02:40:10.000 Comparing two programs for weight loss in a population of experienced athletes, the slow reduction group lost 5.6% of their total body weight with 100% of the loss coming from body fat.
02:40:21.000 So they did it slow, did it nice and scientifically, while simultaneously gaining 2.1% lean mass.
02:40:29.000 So that's showing you it's way better to do it the right way.
02:40:32.000 If 35% to 40% of total weight loss comes from lean tissue, such as observed in many recent GLP-1 agonist trials, it would be disastrous for an athlete's strength, endurance, and performance levels, and I would say resistance to injuries, too.
02:40:46.000 Because if it's saying that it's breaking down your connective tissue, that would be disastrous for knees and shoulders and necks and all that other shit.
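A minimal sketch, in Python, of the arithmetic being discussed here, assuming a hypothetical 30-pound weight loss. The 34.8% lean-tissue figure and the 100%-from-fat slow-reduction result are the numbers read on air; the function and variable names are illustrative, not from any study.

```python
# Sketch of the weight-loss composition arithmetic quoted above.
# The lean fractions come from the figures read on air; the 30 lb
# total loss is a hypothetical example, not trial data.

def split_weight_loss(total_loss_lbs: float, lean_fraction: float):
    """Split a total weight loss into fat and lean-tissue components."""
    lean_loss = total_loss_lbs * lean_fraction
    fat_loss = total_loss_lbs - lean_loss
    return fat_loss, lean_loss

# Semaglutide trial: 34.8% of the loss was muscle, bone, connective tissue.
fat, lean = split_weight_loss(30.0, 0.348)
print(f"GLP-1 agonist: {fat:.1f} lb fat, {lean:.1f} lb lean tissue lost")

# Slow-reduction athletes: 100% of the loss was fat (lean mass rose 2.1%).
fat, lean = split_weight_loss(30.0, 0.0)
print(f"Slow cut:      {fat:.1f} lb fat, {lean:.1f} lb lean tissue lost")
```

On those figures, the same hypothetical 30 pounds costs about 10.4 pounds of muscle, bone, and connective tissue on the drug, versus essentially none on a slow cut combined with strength training.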
02:40:56.000 Well, that goes to the fact that this incredible biological system we have is very hard to understand.
02:41:01.000 For now.
02:41:02.000 We'll fix that.
02:41:03.000 We'll fix that.
02:41:04.000 Well, that's what...
02:41:04.000 You know what?
02:41:05.000 What if you do that with steroids?
02:41:06.000 Maybe it'll fix it.
02:41:08.000 What?
02:41:08.000 They do that with, like, fucking Anadrol 50 or some crazy shit.
02:41:11.000 There's a lot of negative side effects, right?
02:41:14.000 Yeah, we gotta counter that with another pill.
02:41:16.000 Another pill?
02:41:16.000 Yeah, we keep countering.
02:41:17.000 You should be the head of Pfizer.
02:41:19.000 I should.
02:41:20.000 I got ideas.
02:41:21.000 Yeah.
02:41:22.000 Fuck-It-All, from Pfizer.
02:41:24.000 I was getting ads for something like that.
02:41:25.000 Fix the whole problem.
02:41:26.000 I can't remember what it is, but I know people have said it fucks with their brain, so it's like, well, take this brain pill so that it blocks that, and now this thing will fuck with your brain.
02:41:34.000 I forget exactly what it was.
02:41:34.000 How is it fucking with their brain?
02:41:36.000 I wish I remembered what it was.
02:41:37.000 I got an ad for this.
02:41:39.000 Google mental side effects of semaglutide.
02:41:42.000 But it makes sense.
02:41:43.000 There's no biological free lunch, right?
02:41:45.000 When it comes to these complex systems, you're tricking it into losing weight.
02:41:50.000 Of course you're going to lose some shit you want to keep.
02:41:53.000 Most common: anxiety, darkened urine.
02:41:56.000 I like a dark urine.
02:41:58.000 I got a whiskey color.
02:42:00.000 Yeah, yeah, yeah.
02:42:00.000 Aged.
02:42:01.000 Well.
02:42:02.000 It's a complex.
02:42:03.000 Headaches.
02:42:06.000 Large, hive-like swelling on the face, eyelids, lip, tongue, throat, hands, feet, legs, or sex organs.
02:42:13.000 Jesus Christ!
02:42:15.000 Hive-like swellings.
02:42:16.000 Maybe it's not.
02:42:17.000 Maybe it's only on your dick, but they want to, like, scare you.
02:42:19.000 All this other stuff, too.
02:42:22.000 Nightmares.
02:42:23.000 Oh, Jesus.
02:42:23.000 Wait, what do you mean?
02:42:24.000 Wait, is it better if it's on your dick?
02:42:26.000 I don't understand.
02:42:26.000 No, no, no.
02:42:26.000 It's terrible.
02:42:27.000 But they're like, you're going to get it everywhere.
02:42:29.000 Don't worry.
02:42:30.000 It's not just the dick.
02:42:30.000 Oh, right, right.
02:42:30.000 It's softening it up.
02:42:31.000 Yeah, it's your head, your feet.
02:42:33.000 You're like, okay, okay, okay.
02:42:34.000 But if it's like, can it give you hives all over your balls?
02:42:38.000 No!
02:42:39.000 Consequences just for your dick.
02:42:40.000 No, no, no.
02:42:40.000 Your face too, bro.
02:42:41.000 Don't worry.
02:42:42.000 Your hands.
02:42:42.000 Okay, okay, okay.
02:42:43.000 But I'll be skinny, right?
02:42:45.000 Pain in the stomach, side, or abdomen.
02:42:47.000 Possibly radiating to the back.
02:42:49.000 Skin rash.
02:42:51.000 Unusual tiredness or weakness.
02:42:53.000 Yeah, makes sense.
02:42:54.000 If you're not eating much, you're gonna be tired.
02:42:57.000 Unless you're taking our Adderall.
02:42:59.000 What is it?
02:43:00.000 Oh, it may cause some people to have suicidal thoughts and tendencies or to become more depressed.
02:43:05.000 Oh, Jesus Christ.
02:43:07.000 Also tell your doctor...
02:43:08.000 Go back to that again.
02:43:09.000 Also tell your doctor if you have sudden or strong feelings.
02:43:14.000 Including feeling nervous.
02:43:15.000 That's my whole life.
02:43:16.000 Sudden strong feelings.
02:43:18.000 Angry.
02:43:18.000 It's so interconnected.
02:43:20.000 Violent.
02:43:21.000 Restless, violent, or scared.
02:43:23.000 If you or your caregiver notice any of these side effects, tell your doctor right away.
02:43:28.000 You tell your doctor.
02:43:28.000 Your doctor says, don't be a pussy.
02:43:30.000 Do you want to have a fucking six-pack for summer or not, Lex?
02:43:33.000 Take that rash on your dick.
02:43:34.000 Come on, bro.
02:43:36.000 Did Elon say he was taking that stuff?
02:43:39.000 I think he might have.
02:43:42.000 Google that.
02:43:43.000 I think he said he was dropping weight.
02:43:47.000 I think a lot of people are dropping weight with that.
02:43:49.000 Yeah, no, I mean, his big thing he changed is fasting.
02:43:53.000 Oh, that's a good move.
02:43:55.000 What is it saying?
02:44:00.000 What's it saying?
02:44:02.000 Down 30 pounds.
02:44:03.000 Okay, World's Richest Man responded saying, fasting plus, yeah, Ozempic, plus Wegovy, yeah.
02:44:10.000 That's it, that's semaglutide, plus no tasty food near me.
02:44:15.000 Yeah, no snacking, fasting.
02:44:17.000 Yeah.
02:44:19.000 That's, I think, a really good way to lose weight without the drug.
02:44:22.000 He says, bro, I also take Ozempic for my diabetes.
02:44:25.000 Did he say that?
02:44:27.000 No, someone else did.
02:44:28.000 Who said that?
02:44:28.000 Another user asked if his diabetes is a drug.
02:44:30.000 This is the state of modern journalism.
02:44:32.000 Oh my god.
02:44:34.000 Look at this.
02:44:34.000 Someone's tweet gets fucking locked in there.
02:44:37.000 Isn't that funny?
02:44:38.000 That's what they do with articles now.
02:44:41.000 They'll just take some random crackpot who makes a tweet to Lex.
02:44:46.000 If they're writing a hit piece on Lex, they're like, Lex, time to get rid of that stupid fucking suit.
02:44:51.000 A lot of people are very upset about his suit.
02:44:53.000 And they're like, they'll add that to the article.
02:44:57.000 And nobody's bothered by it.
02:44:59.000 Nobody's really bothered because there's no mechanism to fight it, to fight that state of journalism.
02:45:03.000 Well, it's also no one's reading that.
02:45:05.000 That's the other thing.
02:45:06.000 The impact that these things have as opposed to if it was on the front page of the Boston Herald.
02:45:12.000 It used to be if the New York Times had a story, that was where you got your information from, on the front page of the New York Times.
02:45:19.000 Now it's like, who's reading that?
02:45:21.000 Well, anonymous accounts or just people on Twitter that are trying to create drama will cite that as a source.
02:45:28.000 That's true.
02:45:29.000 So it's losing power, for sure, but we're in this transitionary phase where these magazines still have power.
02:45:38.000 New York Times still have prestige and authority.
02:45:41.000 So if it's written there, even though they're misusing that authority, because New York Times, online, there's a huge number of articles that I don't know, Forbes, I don't know any of this.
02:45:51.000 But there's just a huge number of articles.
02:45:53.000 They're using the prestige of that title, and they can write whatever the heck they want.
02:45:57.000 And they're incentivized to write the most dramatic possible thing.
02:46:03.000 I mean, I'm hopeful about ChatGPT replacing all of that, because it'll just be able to automatically generate a bunch of bullshit to where we'll realize it's all...
02:46:12.000 Bullshit.
02:46:13.000 And then we look to actual authentic human beings for our sources of information versus organizations with a nice pretty logo.
02:46:19.000 Yeah, because all you would have to do is hire someone to write that hit piece and then cite that hit piece in a bunch of accounts.
02:46:25.000 And then these fake accounts have their take on this hit piece.
02:46:30.000 And what do you think?
02:46:32.000 You're close to this.
02:46:33.000 What do you think the number of actual bots are in terms of social media?
02:46:38.000 Let's just say Facebook, Twitter.
02:46:40.000 I mean, they found out about Facebook that...
02:46:42.000 Top 20 Christian sites.
02:46:44.000 Sorry if I keep bringing this up, but I thought it was fascinating.
02:46:46.000 19 of the top 20 were Russian troll farms.
02:46:49.000 Top 20 Christian sites.
02:46:51.000 So all these people are going there for, like, you know, Christian news or whatever, and they're, you know, they're steering people.
02:46:58.000 They're getting people riled up.
02:46:59.000 They're putting things out there.
02:47:01.000 They're getting people fired up about stuff.
02:47:03.000 They're organizing things.
02:47:04.000 I think currently AI, it's actually very difficult to build an army of bots that looks human-like.
02:47:13.000 I think currently it's just cheaper, like if I wanted to control- But I don't think it necessarily has to be an army of bots.
02:47:18.000 Yeah, right, right.
02:47:18.000 So I think if I was like- Troll farms.
02:47:21.000 A rich person, yeah, a troll farm, I would hire a bunch of people.
02:47:24.000 That act as bots, essentially.
02:47:27.000 Not act as bots, but they have multiple accounts and they control different, and they get really good at controlling the conversation in the way that steers you towards a particular narrative.
02:47:36.000 I can honestly do it with a team of 10 people, probably, control a narrative of a particular...
02:47:44.000 I don't want to say anything.
02:47:45.000 On the election.
02:47:47.000 I've seen this time and time again where you seed the idea and then the rest of the humans that seek drama, it spreads.
02:47:56.000 I don't know exactly how to fight that.
02:47:58.000 I still have the hope that you could do the same kind of thing with the love bot army.
02:48:03.000 I honestly think that a lot of that can be fought with just developing critical reason in people.
02:48:12.000 Yes.
02:48:14.000 Basically showing to people, revealing to them that there's a lot of misinformation online and only you can figure it out using the capacities of your own reason.
02:48:28.000 Diversifying the sources of news that you take in and all of that.
02:48:32.000 I think you're right.
02:48:34.000 But how many people are going to develop that critical reasoning?
02:48:39.000 You have it.
02:48:40.000 You have it, right?
02:48:40.000 You read something and some people are tweeting about stuff and you're like, is that a real person?
02:48:44.000 Yeah.
02:48:44.000 Like, you'll think about it.
02:48:45.000 You'll take it into consideration.
02:48:47.000 Do you remember when Elon first bought Twitter and there was this...
02:48:50.000 And he posted it, I think, too.
02:48:52.000 A bunch of people posted it.
02:48:53.000 There was someone that did a comparison of this one phrase that was said by so many people about, is it really right for one person to have this much power?
02:49:02.000 And it was just like, oh, these accounts.
02:49:04.000 Saying it in the exact same order.
02:49:06.000 Exact same words.
02:49:08.000 And it's fascinating because, like, people are putting that narrative out there.
02:49:11.000 So who's doing that?
02:49:11.000 Is that a troll farm?
02:49:12.000 Is it a bot?
02:49:13.000 Does it matter?
02:49:14.000 Because it's clearly there's something going on, right?
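A toy sketch, in Python, of how that kind of verbatim repetition can be surfaced, assuming you have a dump of (account, text) pairs: normalize case and whitespace, group identical texts, and flag any phrase posted word-for-word by several distinct accounts. All names and sample data here are hypothetical, not anything from Twitter's actual systems.

```python
# Group posts by normalized text to spot copy-paste coordination.
from collections import defaultdict

def find_copypasta(posts: list[tuple[str, str]], min_accounts: int = 3):
    """Return phrases posted verbatim by at least min_accounts accounts."""
    groups = defaultdict(set)
    for account, text in posts:
        key = " ".join(text.lower().split())  # collapse case and whitespace
        groups[key].add(account)
    return {phrase: accounts for phrase, accounts in groups.items()
            if len(accounts) >= min_accounts}

# Hypothetical sample data echoing the phrase discussed above.
posts = [
    ("user_a", "Is it really right for one person to have this much power?"),
    ("user_b", "is it really right for one person to have this much power?"),
    ("user_c", "Is it really right for one person to have  this much power?"),
    ("user_d", "I just like rockets."),
]
for phrase, accounts in find_copypasta(posts).items():
    print(f"{len(accounts)} accounts posted: {phrase!r}")
```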
02:49:17.000 And I would do that if I didn't want him to do it.
02:49:20.000 If I was, like, some competitor or if I was some organization that, you know, was enjoying the benefits of it being censored and having some sort of interaction with the company to be like, hey, this story, we should fucking kill it.
02:49:35.000 And then you knew you could just get it killed.
02:49:38.000 But there's other forces like writing a negative thing about Elon, a tweet or an article is more likely to produce likes.
02:49:47.000 That's the current state of things, especially ever since, and I criticize him for this, becoming political.
02:49:53.000 He didn't need to be political.
02:49:55.000 Well, it's not just political.
02:49:56.000 It's like we talked about the Paul Pelosi tweet, right?
02:50:00.000 It's like when you're that guy, God, you'd be extra, extra, extra, extra careful.
02:50:05.000 But he wouldn't be that guy if he was careful, right?
02:50:07.000 Right.
02:50:07.000 That's the trade-off.
02:50:08.000 Which is the trade-off.
02:50:09.000 Say, fuck it.
02:50:10.000 When he made that picture of the pregnant man emoji right next to Bill Gates and said, if you want to lose a boner real quick, that's the same guy that wants to put people on the moon.
02:50:20.000 I mean, it's so crazy.
02:50:22.000 He wants to be a multi-planetary civilization and dunk on people.
02:50:25.000 The fact that it's the same person, it's like, there's not another dude like him out there.
02:50:30.000 And I mean, he's not a troll.
02:50:33.000 He's a part-time troll.
02:50:35.000 He's an incredible leader of teams.
02:50:38.000 He can hire better than anybody I've ever seen.
02:50:40.000 So build up a team, get rid of the people on the team that are not contributing effectively.
02:50:46.000 That's really rare, especially for large companies.
02:50:49.000 Well, when he moved into Twitter and did that, it was funny, like, the outrage, but yet there was so much information out there, so much evidence that, like, there was a lot of waste there.
02:50:59.000 Like, I'm sure you saw the video of this woman who outlines her day at Twitter.
02:51:05.000 Like, I'm so blessed to work at Twitter.
02:51:07.000 Did you ever see that video?
02:51:09.000 Oh my god, we have to watch it.
02:51:10.000 Because it's amazing.
02:51:11.000 Because, like, it's barely, she's barely working.
02:51:14.000 She's probably advertising how great the life is, not even understanding that that's not supposed to be the life of a tech person.
02:51:24.000 Well, we think of what he does, right?
02:51:27.000 The grind.
02:51:28.000 Sleeping on the office floor, really trying hard to solve these problems, demanding that sort of work ethic from all the people that he works with.
02:51:36.000 But watch this video because it's...
02:51:38.000 It's so hilarious.
02:51:41.000 As a Twitter employee, so this past week went to SF for the first time at a Twitter office, badged in, honestly took a moment to just soak everything in.
02:51:52.000 What a blessing.
02:51:53.000 Also started my morning off with an iced matcha from the perch.
02:51:56.000 Then I had a meeting, so quickly scheduled one of these little pod rooms, which were so cool.
02:52:03.000 They're literally noise canceling.
02:52:04.000 Took my meeting, got ready for lunch.
02:52:07.000 Look how delicious this food looks.
02:52:09.000 By the way, no slight to this lady.
02:52:11.000 What a great job.
02:52:12.000 I love it.
02:52:14.000 I'd want to work there.
02:52:15.000 I don't know what this is, but it was really cool.
02:52:18.000 Played some foosball with my friends.
02:52:19.000 She played some foosball.
02:52:23.000 Also found this really cool meditation room.
02:52:26.000 Oh, you gotta meditate.
02:52:27.000 How do you lose at foosball?
02:52:28.000 Let's meditate.
02:52:29.000 I didn't do any yoga.
02:52:31.000 Oh, yoga!
02:52:32.000 If you're a yogi, though, so I also thought that was really cool.
02:52:35.000 Pretty cool.
02:52:36.000 Had a couple more meetings in the afternoon.
02:52:39.000 Had a ton of projects that we needed to knock out.
02:52:41.000 So, hi to my teammates.
02:52:43.000 Went to the library to kind of get some more work done.
02:52:47.000 Obviously, had to have our afternoon coffee.
02:52:49.000 This is hard to watch.
02:52:50.000 So, made some espresso.
02:52:52.000 Espresso!
02:52:53.000 Had some red wine that's on tap.
02:52:57.000 Red wine on tap.
02:52:57.000 Let's get fucked up, Lex.
02:52:59.000 That's the first good thing I see.
02:53:01.000 Let's go!
02:53:02.000 And look, on the roof.
02:53:03.000 So, awesome trip.
02:53:05.000 This is Twitter.
02:53:06.000 I experienced the same thing at Google.
02:53:09.000 I was at Google for a bit, and I showed up wanting to...
02:53:14.000 I mean, you know me.
02:53:16.000 Yeah, I know you.
02:53:16.000 100 hours a week, Lex.
02:53:20.000 Regina Dugan, I don't know if I should say this, but she's an incredible woman.
02:53:23.000 She's my boss.
02:53:25.000 I won't say how many hours she said, but she made me feel like I'm going to grind here.
02:53:31.000 This is going to be great.
02:53:33.000 And I showed up and everybody was like, there's a meditation room, there's nap pods, there's free food, and everyone's relaxed.
02:53:40.000 I don't want to criticize that because I think it's possible to do the grind in a healthy way.
02:53:45.000 Not like this, but in a healthy way.
02:53:48.000 Right, you take some time off every now and then.
02:53:50.000 Well, not even now.
02:53:51.000 It depends on the job.
02:53:53.000 I think programmers usually put in more hours.
02:53:55.000 But even for programming, to be effective, really, two to four hours a day is when you're really focused and really being productive.
02:54:07.000 Everything else is filler.
02:54:09.000 Like a writer.
02:54:10.000 Yeah, like a writer.
02:54:11.000 It's similar to that.
02:54:12.000 So whatever makes that happen, I think is good.
02:54:18.000 But the danger is creating a culture where you're having fun.
02:54:24.000 Everything is great.
02:54:25.000 There's food.
02:54:26.000 Google and a lot of these companies think, let's make our employees happy.
02:54:30.000 So they want to stay here a long time.
02:54:33.000 And then the good productive ones will do awesome stuff.
02:54:38.000 Do you think that's why there's these mass layoffs that they realize like this is not effective?
02:54:42.000 No, they do these mass layoffs regularly.
02:54:44.000 When the economy goes down, they hire a bunch of people, let's create a happy space.
02:54:48.000 Now, like, the thing is, with Google and all these companies, they're often bringing in a huge amount of money, so there's not a deadline.
02:54:55.000 Like, Twitter was actually going bankrupt.
02:54:57.000 It was going towards the negative.
02:55:00.000 Before Elon took over?
02:55:01.000 Before Elon bought it, yeah.
02:55:02.000 So they were fucked?
02:55:03.000 Well, not, I think...
02:55:04.000 In trouble?
02:55:05.000 In trouble, yeah, yeah.
02:55:06.000 Like, I don't know exactly...
02:55:07.000 Did he know that when he bought it?
02:55:09.000 I think so.
02:55:12.000 Maybe not the full extent of it.
02:55:13.000 What a wild move.
02:55:15.000 Yeah.
02:55:16.000 What a wild move.
02:55:17.000 Step in and buy a social media corporation, the number one distributor of information.
02:55:23.000 Yeah.
02:55:24.000 I mean, if not the number one, maybe Facebook's number one.
02:55:28.000 Fundamentally, it's a software engineering company, so he's rolling in there, like, not knowing anything about the teams, not anything.
02:55:35.000 It's probably just, like, waking up one day and just saying, you know what?
02:55:39.000 Like, it was probably had to do...
02:55:40.000 I haven't talked to him, actually, about this.
02:55:42.000 Like, what was the...
02:55:43.000 Like, why was he in a mood that he said, I'm going to buy Twitter?
02:55:47.000 But it probably had to do with, like...
02:55:50.000 Certain features not working well.
02:55:52.000 It's like, why are they not innovating?
02:55:54.000 Because he really likes Twitter.
02:55:56.000 He's like, this is a pretty cool website.
02:55:59.000 Like, why are they not innovating?
02:56:01.000 I'm making this thing better.
02:56:02.000 I think there is also this issue of censorship.
02:56:06.000 I think that's a big issue with him.
02:56:07.000 I don't think that's a cursory thing.
02:56:10.000 I think he, I mean, one of his statements was that if they can't pull it off, like, democracy might be doomed.
02:56:17.000 I wonder.
02:56:18.000 I wonder.
02:56:19.000 I mean, where does it go if there is complete control of a narrative and then it becomes untanglable, right?
02:56:28.000 Like if there's complete control of a narrative and information, it's actually controlled by the central power.
02:56:35.000 It's controlled by the government.
02:56:36.000 It's controlled by whoever's in office, by the intelligence agencies which never leave office.
02:56:42.000 If that becomes how all of our information gets out to us, that is, it's a very, you would hope that they would do a great job in being fair and balanced and telling us the truth about everything and just keeping us from bad information.
02:56:57.000 But if you go over the history of not just this country but of every country, there's been times where they've done things that are contrary to the interest of the public and they've done it measurably.
02:57:08.000 You could see it.
02:57:09.000 You get freedom of information files on all kinds of things that the government has done that people are very, very unhappy with.
02:57:15.000 If they can control a narrative and – That's fucking dangerous.
02:57:18.000 And him being in control of Twitter, as much as the cucks freak out, at the end of the day, at least you have one pathway for information where you get to see things debated and disputed.
02:57:33.000 And that's just not the case in the other ones.
02:57:36.000 Well, there's a problem.
02:57:37.000 I mean, power corrupts.
02:57:39.000 And absolute power corrupts absolutely.
02:57:40.000 Elon can be corrupted.
02:57:42.000 And so he's been attacked by the left aggressively, which is part of the reason he's now leaning right, if I had to guess.
02:57:49.000 I'm not a therapist.
02:57:50.000 But now he's leaning right, I believe, more than he's comfortable with because of the intense attacks from the left.
02:57:58.000 So it's like a vicious cycle.
02:58:00.000 But you can convert the bias that Twitter previously had into the other direction.
02:58:06.000 Either the left or the right are, they're both susceptible to the corrupting nature of power.
02:58:12.000 So I think the bigger thing is, the bigger issues, what I think Jack Dorsey has talked about, is putting the power of censorship into the hands of the company is the problem.
02:58:24.000 So you have to somehow remove it.
02:58:25.000 You have to distribute.
02:58:27.000 You have to outsource, remove the censorship, like leave it up to the people to censor themselves.
02:58:32.000 Meaning to control what kind of people I want in my life, on social media, who am I interacting with.
02:58:39.000 Don't have a centralized committee and meeting that censors.
02:58:42.000 Because you're always going to run into trouble there.
02:58:46.000 And you see that now with even Twitter.
02:58:48.000 There's questionable decisions being made now in the other direction also.
02:58:54.000 But in general, it seems like there's cherry picking of who gets banned and not.
02:59:01.000 That's always going to be the case if it's centralized.
02:59:04.000 It's going to probably be better with Elon because he's more allergic to bullshit than others, but any centralized power is going to get corrupted.
02:59:15.000 Yeah, interesting.
02:59:17.000 Isn't Jack Dorsey working on some sort of a decentralized version of social media?
02:59:22.000 Yeah, he has been for a while.
02:59:23.000 I think he launched some stuff.
02:59:24.000 He has interesting ideas.
02:59:25.000 But I think he really believes in fully decentralized.
02:59:28.000 Yeah.
02:59:28.000 I think there's some centralization which is really important to create a product that's awesome to use.
02:59:36.000 You want a benevolent leader?
02:59:38.000 Well, no.
02:59:39.000 Yes, but a benevolent leader that sets the mission and so on and hires and everything like that.
02:59:44.000 But certain stuff like censorship, you have to outsource it.
02:59:48.000 You have to make it distributed.
02:59:49.000 But should you even have it at all?
02:59:51.000 Yes, because it's freedom of speech versus freedom of reach.
02:59:56.000 I don't want a person with a megaphone screaming.
03:00:01.000 I want to be able to choose not to listen to the screaming guy.
03:00:04.000 Right.
03:00:05.000 My life is happy.
03:00:07.000 There's a podcast I can recommend.
03:00:08.000 Podcast show?
03:00:09.000 Intelligence Squared Debates.
03:00:11.000 There's a US version and there's a British version.
03:00:13.000 If you like British snark, that's a little better.
03:00:15.000 And they have pretty heated debates between each other.
03:00:19.000 I like that.
03:00:20.000 I like that kind of disagreement.
03:00:21.000 Those debates and high effort disagreement.
03:00:23.000 Yes.
03:00:24.000 Like, people just talking shit and mocking and trolls, like Jordan has talked about.
03:00:30.000 They destroy the quality of the conversation.
03:00:32.000 Now, I don't want to, like, remove them from the...
03:00:35.000 They should have a community.
03:00:37.000 If people want to say shitty things to each other, it's great.
03:00:39.000 But each individual person should be able to control to some degree how cool their party is.
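A minimal sketch, in Python, of that idea: user-side filtering instead of central removal. Every post stays up globally; each person's feed is just screened against their own mute lists. The names and data here are hypothetical.

```python
# Each user controls their own party: posts are filtered per reader,
# never deleted platform-wide.

def visible_feed(feed, muted_authors, muted_words):
    """Return the posts one particular user sees, given their mute lists."""
    out = []
    for author, text in feed:
        if author in muted_authors:
            continue  # this reader muted the account
        if any(word in text.lower() for word in muted_words):
            continue  # this reader muted the phrase
        out.append((author, text))
    return out

feed = [
    ("screaming_guy", "YOU ARE ALL SHEEP"),
    ("tim", "new Raptor engine video is up"),
]
# The screaming guy keeps his speech; this reader just opts out of the reach.
print(visible_feed(feed, muted_authors={"screaming_guy"}, muted_words=set()))
```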
03:00:44.000 Well, that's what Mastodon's trying to do, right?
03:00:47.000 Adam Curry talked about that.
03:00:49.000 You know, you can kind of control who's in your Mastodon server, and you can stop shitty people from interacting with you.
03:00:56.000 But the problem with Mastodon is that there's not enough centralization to create an awesome product.
03:01:00.000 As a product, it kind of sucks.
03:01:03.000 It's very difficult to set up.
03:01:04.000 Difficult to navigate.
03:01:05.000 Navigate, to use.
03:01:07.000 It's difficult to...
03:01:08.000 No person with a large following is going to use it for now.
03:01:13.000 For now.
03:01:14.000 Same with Rumble, too.
03:01:15.000 I would love for Rumble to be successful.
03:01:19.000 Is it successful yet?
03:01:20.000 I don't think so.
03:01:21.000 There's a few famous people that went on to Rumble.
03:01:23.000 They throw a lot of money around.
03:01:25.000 Yeah.
03:01:25.000 But ultimately, at the end of the day, you have to create a really nice product that competes with YouTube.
03:01:31.000 Not the content, but it's fun to use.
03:01:34.000 It's easy to use.
03:01:37.000 You can play in the background.
03:01:38.000 There's no bugs.
03:01:40.000 The recommendation works.
03:01:42.000 Everything just works.
03:01:42.000 YouTube is really easy to use.
03:01:44.000 The search, the discovery, the stuff that's apolitical.
03:01:47.000 The search discovery works great.
03:01:49.000 The comment system works great.
03:01:52.000 The actual interface works great.
03:01:53.000 Rumble, I think, has a way to go there.
03:01:56.000 But philosophically, Rumble just provides a nice resistance to the over-censorship that is YouTube, over-caution that is YouTube.
03:02:03.000 Yeah, I like the fact that there's alternatives.
03:02:06.000 Twitter might be that alternative.
03:02:08.000 The other thing about the argument for censorship is that if you do admit that there are some bots, or at the very least there's people that are hired to do certain things and to push a narrative on Twitter, if you allow that,
03:02:23.000 you could allow someone to game the system.
03:02:25.000 If you just have no moderation at all, you could most certainly, someone could come in and game the system and just flood, especially if your timeline is by time.
03:02:37.000 It's not...
03:02:38.000 Chronological.
03:02:40.000 It's not by an algorithm.
03:02:42.000 If you do that, man, shit, you could really just swarm something.
03:02:46.000 And keep like these posts coming in that have one narrative and one narrative only.
03:02:53.000 And if you're interested in a subject like what happened in blah, blah, blah in Cincinnati and you go there to the story and the narrative is promoted by people that have a vested interest in getting one version of the truth out.
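A toy illustration, in Python, of that flooding dynamic under the stated assumption of a purely chronological feed: whoever posts most recently and most often owns the top of the timeline, and a simple per-author cap is one naive countermeasure. All names and data here are made up.

```python
# Chronological feeds reward volume; a per-author cap blunts the swarm.

posts = [  # (timestamp, author, text)
    (1, "alice", "eyewitness photo from Cincinnati"),
    (2, "bot1", "the official narrative, take 1"),
    (3, "bot1", "the official narrative, take 2"),
    (4, "bot1", "the official narrative, take 3"),
    (5, "bob", "local news link"),
    (6, "bot1", "the official narrative, take 4"),
]

def chronological(feed):
    """Newest first: whoever posts most often dominates the top."""
    return sorted(feed, key=lambda p: p[0], reverse=True)

def capped(feed, per_author=1):
    """Same ordering, but each author gets at most per_author slots."""
    counts, out = {}, []
    for post in chronological(feed):
        author = post[1]
        counts[author] = counts.get(author, 0) + 1
        if counts[author] <= per_author:
            out.append(post)
    return out

print(chronological(posts)[:4])  # bot1 holds three of the top four slots
print(capped(posts))             # one slot per author
```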
03:03:04.000 Yeah, I trust in the people's general ability to figure out the different perspectives on a story and to figure out the truth from that.
03:03:14.000 You have to trust in that ability and try as much as possible to remove the low-effort bullshit.
03:03:20.000 So not the wrong bullshit, not the misinformation, but the mockery, the trolling, the bot stuff, all of that.
03:03:32.000 It's really difficult because there's a lot of gray areas.
03:03:35.000 There's a lot of, obviously, amazing humor online.
03:03:37.000 That's like mockery sometimes is one of the best ways to get to the truth.
03:03:40.000 Yeah.
03:03:41.000 I mean, that's Tim Dillon's whole existence, right?
03:03:44.000 So you have to be extremely careful with what is and isn't, but I think you have to put that power in the hands of individual users versus some kind of centralized entity.
03:03:56.000 Yeah.
03:03:56.000 It's complicated, right?
03:03:57.000 There's no simple, clear path towards a perfect environment.
03:04:02.000 But I think that's also part of what's going on is this weird struggle to kind of figure out how to do this correctly.
03:04:09.000 And that's where it's fascinating that a guy like Elon comes along, where you get this very wealthy and influential person that says, like, you can't just let it go this way.
03:04:16.000 Let's introduce this new element and try to figure it out in a way where I'm not listening to the intelligence agencies and a bunch of weird shenanigans.
03:04:25.000 And release these Twitter files and allow these journalists to go over all this data.
03:04:31.000 Just that alone has been a massive service.
03:04:34.000 Jamie, you were saying something to me the other day about Russian bots that you think...
03:04:39.000 What about?
03:04:40.000 The Renée DiResta stuff.
03:04:44.000 Oh...
03:04:44.000 Do you remember?
03:04:45.000 That was with the Twitter file stuff.
03:04:46.000 Right.
03:04:47.000 They were saying all that stuff.
03:04:48.000 They were saying there were almost no Russian bots, I feel like.
03:04:51.000 What does that mean, though?
03:04:52.000 I don't know.
03:04:52.000 There's no Russian bots?
03:04:54.000 Is that what the Twitter file said?
03:04:56.000 The Twitter files, I think, said that there's not a significant influence on the election from them.
03:05:02.000 But that's Twitter, right?
03:05:04.000 Wasn't the...
03:05:05.000 Renée DiResta's stuff, a lot of it was Facebook.
03:05:07.000 Yeah.
03:05:08.000 Yeah.
03:05:09.000 I don't know, man.
03:05:10.000 It's interesting to watch it all get sorted out.
03:05:13.000 The dark one, for me, it's really difficult to know what to do with shadow banning.
03:05:17.000 That seems like deeply wrong.
03:05:19.000 It seems creepy.
03:05:21.000 What do you got there?
03:05:22.000 You gonna read something?
03:05:23.000 A poem.
03:05:25.000 I knew it.
03:05:26.000 I'm like, we're gonna wrap this up soon and this motherfucker's gonna bust out a poem.
03:05:29.000 Of course I am.
03:05:31.000 Yeah, the shadow banning.
03:05:32.000 You know what's really fucked?
03:05:33.000 Is that they lied.
03:05:35.000 That's what's really fucked.
03:05:37.000 It's not just they were shadow banning, it's that they commented on it and they lied.
03:05:41.000 They said it wasn't happening.
03:05:43.000 I mean, if you were shadow banning, would you tell the truth about it?
03:05:46.000 I think the right thing is to not shadow ban.
03:05:49.000 Yeah, the right thing is to not shadow ban, but the fact that they just openly lied about it.
03:05:53.000 Like, that's one of the fascinating things about Elon buying it, is we get to see that.
03:05:59.000 Yeah.
03:05:59.000 Like, no, they lie!
03:06:01.000 They just lie!
03:06:02.000 And it becomes a culture. That's the dark thing, that a culture at a company can make that not seem like a big deal.
03:06:09.000 Right.
03:06:10.000 It's important to preserve democracy, Lex.
03:06:12.000 Right.
03:06:13.000 And that's something I think about even with pharma companies.
03:06:17.000 The culture in general, the vision of Pfizer and so on, is to create medicine that helps a large number of people.
03:06:24.000 Boner pills.
03:06:24.000 Boner pills, primarily, yes.
03:06:26.000 What do you got there?
03:06:29.000 That's a poem.
03:06:30.000 Let's wrap this up.
03:06:33.000 This is a lot of fun, man.
03:06:35.000 I was trying to get close to the truth when I mentioned Pfizer.
03:06:37.000 I think you got close to the truth.
03:06:38.000 No, I'm just kidding.
03:06:38.000 You want to go more?
03:06:39.000 No, I don't want to go more.
03:06:40.000 You keep talking about Pfizer if you like.
03:06:44.000 Bukowski, of course.
03:06:49.000 I find myself disagreeing with him a lot lately.
03:06:52.000 Of course.
03:06:53.000 You're not a drunk and a loser.
03:06:54.000 Well, you know that letter he wrote to a friend, find what you love and let it kill you?
03:07:02.000 Yeah.
03:07:02.000 That line.
03:07:03.000 Yeah.
03:07:05.000 Yeah, I think he was referring to love there, like a romantic partner.
03:07:10.000 That's something I, depending on the day, agree and disagree with.
03:07:17.000 So basically, find the thing you're passionate about and let it destroy you.
03:07:20.000 Well, one of the reasons why he resonates, obviously I'm joking around about him being a loser.
03:07:25.000 He's a very successful guy.
03:07:26.000 But one of the things about him that resonates with a lot of people is the pain.
03:07:31.000 There's something in his writing where his pain and his frustration are very tangible.
03:07:43.000 And it resonates with people.
03:07:45.000 You feel it.
03:07:45.000 You feel it in his writing.
03:07:47.000 And then when you see him, you realize who he was.
03:07:50.000 That's all real.
03:07:51.000 You ever see when he gets mad at people in the audience and he's yelling at them in the audience in some of his readings?
03:07:57.000 He's drunk and he's just reading off of his poetry.
03:08:01.000 Fascinating guy.
03:08:02.000 And the authenticity.
03:08:04.000 There's something deeply authentic about that.
03:08:06.000 Right.
03:08:06.000 That's not an act.
03:08:07.000 That's who he is.
03:08:08.000 Well, so that's what this poem is about.
03:08:10.000 It's called So You Want to Be a Writer.
03:08:12.000 It's a bit aggressive.
03:08:13.000 Okay.
03:08:16.000 If it doesn't come bursting out of you in spite of everything, don't do it.
03:08:20.000 Unless it comes unasked out of your heart and your mind and your mouth and your gut, don't do it.
03:08:26.000 If you have to sit for hours staring at your computer screen or hunched over your typewriter searching for words, don't do it.
03:08:33.000 If you're doing it for money or fame, don't do it.
03:08:36.000 If you're doing it because you want women in your bed, don't do it.
03:08:40.000 If you have to sit there, rewrite it again and again, don't do it.
03:08:44.000 If it's hard work just thinking about doing it, don't do it.
03:08:47.000 If you're trying to write like somebody else, forget about it.
03:08:51.000 If you have to wait for it to roar out of you, then wait patiently.
03:08:55.000 If it never does roar out of you, do something else.
03:08:58.000 If you first have to read it to your wife, to your girlfriend, or your boyfriend, or your parents, or to anybody at all, you're not ready.
03:09:07.000 Don't be like so many writers.
03:09:09.000 Don't be like so many thousands of people who call themselves writers.
03:09:13.000 Don't be dull and boring and pretentious.
03:09:16.000 Don't be consumed with self-love.
03:09:18.000 The libraries of the world have yawned themselves to sleep over your kind.
03:09:23.000 Don't add to that.
03:09:25.000 Don't do it.
03:09:26.000 Unless it comes out of your soul like a rocket.
03:09:29.000 Unless being still will drive you to madness or suicide or murder, don't do it.
03:09:34.000 Unless the sun inside you is burning your gut, don't do it.
03:09:38.000 When it is truly time, and if you have been chosen, it will do it by itself, and it will keep on doing it until you die or it dies in you.
03:09:48.000 There's no other way, and there never was.
03:09:54.000 Boom.
03:09:54.000 Find what you love and let it kill you, Joe Rogan.
03:09:57.000 Thank you, brother.
03:09:57.000 This was so much fun.
03:09:59.000 We gotta do this more often, man.
03:10:00.000 We live in the same town.
03:10:01.000 Every time we do it, I always say, God, I fucking love this guy.
03:10:04.000 Yeah, I love you too.
03:10:05.000 So much fun.
03:10:06.000 So much fun to talk to you.
03:10:07.000 Appreciate you very much, man.
03:10:08.000 And your show was amazing, by the way.
03:10:09.000 It's my favorite show to watch on YouTube.
03:10:11.000 Nah, stop it.
03:10:12.000 I love it.
03:10:12.000 You're really fantastic at interviewing people and talking to people.
03:10:16.000 You're really good at it.
03:10:17.000 And you're very balanced in your approach.
03:10:19.000 You're a really good listener.
03:10:20.000 Really good at letting people express themselves.
03:10:23.000 Thank you, buddy.
03:10:23.000 Appreciate it very much.
03:10:24.000 I love you, man.
03:10:25.000 I love you, too.
03:10:25.000 Bye, everybody.