The Joe Rogan Experience - August 11, 2022


Joe Rogan Experience #1855 - Chris Best


Episode Stats

Length

2 hours and 34 minutes

Words per Minute

173.46112

Word Count

26,846

Sentence Count

1,950

Misogynist Sentences

6

Hate Speech Sentences

13


Summary

In this episode of the Joe Rogan Experience podcast, I speak with Chris Best, the founder of Substack, a platform that helps writers publish independently, reach their readers directly, and get paid for their work. We talk about how he got started, why he started it, and what it's like to be a writer in the 21st century. And we talk about why he thinks there's a dark side to the internet, and how to deal with it. It's a great episode, and I hope you enjoy it. If you like what you hear, please hit subscribe on Apple Podcasts or wherever else you get your podcasts, and don't forget to leave us a rating and a review. It helps us keep bringing you high-quality episodes like this one. Tweet me if you liked it and let us know what you thought of it!


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:13.000 What's up, Chris?
00:00:14.000 How are you?
00:00:14.000 Good.
00:00:15.000 What's going on?
00:00:17.000 Have you done podcasts before?
00:00:19.000 Nothing like this.
00:00:20.000 I've done a few.
00:00:21.000 Yeah?
00:00:21.000 Okay.
00:00:22.000 Cool.
00:00:23.000 So tell me, first of all, tell me what was the inspiration to start Substack?
00:00:29.000 How did it come about?
00:00:32.000 I've always been an avid reader.
00:00:35.000 My dad was an English teacher growing up.
00:00:36.000 We had books around the house.
00:00:38.000 And I've always thought that what you read matters.
00:00:42.000 Like it shapes who you are.
00:00:43.000 It shapes how you think.
00:00:44.000 It creates like who you are as a person.
00:00:47.000 And so great writing matters a lot.
00:00:52.000 In my other life I do software.
00:00:54.000 Software is this magical thing where you can write a piece of code and it does something for a million people.
00:00:59.000 If you write a great essay, a great book, a great thought, you can change who a million people are.
00:01:04.000 And so great writing is this valuable thing.
00:01:07.000 And when I took a sabbatical from a company that I'd done, I was like, I should be a writer.
00:01:12.000 That would be good.
00:01:13.000 Like, how hard could it be?
00:01:15.000 These guys are doing good things.
00:01:16.000 And I started writing what I thought was going to be like an essay or a blog post or a screed or something, outlining my frustration with the state of the media industry, the state of incentives on the internet, basically complaining, wah, wah, wah, social media is breaking our brains,
00:01:32.000 you know, this kind of shit.
00:01:33.000 And I sent it to my friend Hamish who's really a writer and he told me, like anybody can complain about this stuff.
00:01:41.000 You're not as original as you think.
00:01:42.000 All of my friends who are writers know all of this stuff.
00:01:45.000 The more interesting question is, if all of this is true, what could you do about it?
00:01:51.000 And that turned into Substack.
00:01:54.000 And what year was this?
00:01:56.000 2017. It's really perfect timing for when everything started getting really heavy in terms of censorship and also the chaos that came about because of the pandemic and journalists getting canceled and there was so much weird stuff in terms of what you were allowed to write about or not allowed to write about.
00:02:17.000 And then, of course, the Hunter Biden thing, the laptop, all that stuff came about in the first few years.
00:02:23.000 A lot of the best writers in the world, in my estimation, were getting kind of tissue rejected from the places where they would have been before.
00:02:32.000 Tissue rejected?
00:02:34.000 It's an analogy.
00:02:36.000 Like an organ transplant that fails kind of thing.
00:02:38.000 They're getting sort of pushed out from the places that would have been their home and where they could have done the thing that mattered to them before.
00:02:47.000 What happened?
00:02:50.000 What steps fell into place that caused all this?
00:02:55.000 My theory on this is that it's a combination of natural human affairs, right?
00:03:03.000 Like there's human nature, people act in certain ways, there's dark tendencies that come out when you get people together at scale, colliding with the consequences of the first generation of the internet revolution, basically.
00:03:16.000 The way that the first generation of the internet played out was this massive land grab for human attention.
00:03:21.000 So first of all, the computer and then even more so the smartphone kind of gobbled up all of the slices of people's lives that were just sitting there.
00:03:30.000 People used to get bored and then the smartphone came along and that just didn't exist anymore.
00:03:34.000 And in that phase, the things that won were the things that were the most efficient at...
00:03:39.000 Gobbling up everyone's attention.
00:03:42.000 And so you had this sort of...
00:03:44.000 The game that everyone played was like, get everyone's eyeballs.
00:03:48.000 And the things that you do to win at that game create an incentive landscape that drives everyone crazy.
00:03:55.000 Yeah.
00:03:55.000 The way to win at that game is be outraged or get people outraged.
00:04:00.000 Yeah.
00:04:01.000 The way to win at Twitter is be bad in a lot of ways.
00:04:03.000 And if you don't want to do it, somebody else will.
00:04:06.000 Be bad.
00:04:08.000 Well, I mean, bad in some, like, be outrageous, be...
00:04:11.000 The ultimate tweet, as I've found out myself sometimes, is not the thing that everyone agrees with or even the thing that everyone hates.
00:04:20.000 It's the thing that maximally divides people.
00:04:22.000 The thing that most separates the people that are in your tribe on your side and makes them kind of, like, cheer and at the same time spits in the face of the other people.
00:04:32.000 That is the recipe for a successful tweet because...
00:04:37.000 That's the incentive landscape that makes Twitter succeed.
00:04:40.000 Yeah, it's just I go on Twitter once a day, maybe twice a day just to see what kind of shit the monkeys are throwing at each other.
00:04:49.000 It seems like a mental institution sometimes.
00:04:53.000 I see people arguing over things and things that are trending that have zero impact in my life, and I don't understand why people are putting so much attention to it, but it seems like the recreational outrage that comes about because of Twitter is one of the most addictive things I've ever witnessed people take part in.
00:05:14.000 I mean, I say people.
00:05:15.000 I took part in it a little bit for a while, but now I don't engage at all.
00:05:19.000 Literally, I don't read my mentions.
00:05:21.000 I occasionally post things, and then I just get the fuck out of there.
00:05:24.000 I just think it's just too poisoned.
00:05:30.000 Yeah.
00:05:32.000 You're a wiser man than most.
00:05:34.000 Well, I just see it.
00:05:35.000 I see it in other people.
00:05:36.000 I see what it does to people.
00:05:41.000 It's very strange, because I never thought Twitter was going to become that.
00:05:44.000 I always thought Twitter was just some innocuous thing.
00:05:47.000 When it first came around, it was silly.
00:05:49.000 A lot of comedians loved it, because it was a great little...
00:05:53.000 Because in the beginning it was only 140 characters, it's great to keep your jokes succinct and short little blurbs and try to find fun, funny things to say.
00:06:03.000 But then it just became some strange way for people to expose their mental illness.
00:06:09.000 Yeah.
00:06:09.000 And none of that stuff is new.
00:06:11.000 None of the bad things that people do on social media are a new facet of humanity.
00:06:16.000 It's just, it amplifies it.
00:06:18.000 And it creates this, like, false reality that everyone sees that slowly drives us crazy.
00:06:23.000 So how difficult was it to A, start Substack, and then B, get journalists to come on board?
00:06:31.000 The hardest part of starting Substack was convincing ourselves that it could work.
00:06:36.000 Because it started as I was literally writing this essay.
00:06:38.000 And Hamish and I were talking, and we just came across this idea of, like, what if we let writers go independent themselves?
00:06:45.000 What if we let you start your own thing, you get the email addresses, you own everything, people can pay you directly, now you're getting hired and fired by your readers.
00:06:53.000 It sounded too simple to possibly work.
00:06:55.000 We're like, if this thing could work, somebody would have done this already.
00:06:58.000 It seems stupid.
00:06:59.000 But we kind of talked each other into it.
00:07:01.000 And, you know, I'm a tech nerd.
00:07:03.000 I'm a product guy.
00:07:04.000 Hamish is not that.
00:07:05.000 He's a writer.
00:07:06.000 He knows that world.
00:07:07.000 And we kind of both thought that it could work.
00:07:10.000 And so we just sort of, like, slowly talked each other into it.
00:07:13.000 He had a friend who was a writer who, like, needed it right away, basically.
00:07:18.000 Had wanted something like this and became our first customer.
00:07:21.000 A guy called Bill Bishop writes Sinocism.
00:07:23.000 It's a newsletter about China that everybody in business and government reads.
00:07:26.000 Why did he need it right away?
00:07:27.000 Well, I mean, so he'd had this newsletter that he'd been writing for free and paying for the privilege of sending that was just like, what the hell is actually going on in China for anybody who needs to actually know?
00:07:40.000 And, you know, lots of business people, government people all over the world would read it.
00:07:44.000 And he's been like, instead of paying to send this thing out, I should charge people for this, obviously.
00:07:50.000 But I couldn't figure out how to wire up the payment with the sending.
00:07:56.000 You just needed someone to handle the details of it.
00:07:59.000 And we were like, great, we'll do that for you.
00:08:01.000 We'll do everything for you except the hard part.
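The problem Bill Bishop hit, as described here, is really just the glue between billing and sending: a paid post should only go out to readers whose subscription payment is current. A minimal sketch of that idea in Python, with an in-memory subscriber list standing in for the real billing and email services (the names and data are illustrative, not Substack's actual API):

```python
from dataclasses import dataclass

@dataclass
class Subscriber:
    email: str
    paid: bool  # True if this reader's subscription payment is current

def send_email(address: str, subject: str, body: str) -> None:
    # Stand-in for a real email provider; here we just print the delivery.
    print(f"To: {address} | {subject}")

def publish_post(subscribers: list[Subscriber], subject: str, body: str,
                 paid_only: bool = True) -> None:
    """Send a post to the list, gating paid posts on an active subscription."""
    for sub in subscribers:
        if paid_only and not sub.paid:
            continue  # the "wiring": payment status decides who receives the send
        send_email(sub.email, subject, body)

subs = [Subscriber("a@example.com", paid=True),
        Subscriber("b@example.com", paid=False)]
publish_post(subs, "Today's China brief", "...")
```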
00:08:05.000 So you got him, and how did you get the word out?
00:08:09.000 How did it start to really become a player?
00:08:14.000 A lot of it was we started with Hamish's friends, like people who he knew.
00:08:19.000 And we would just go and talk to them.
00:08:21.000 And especially early on, a lot of it was just telling people about why we were doing this thing and what we thought was wrong and how this fairly simple platform we were building could help.
00:08:36.000 And if people...
00:08:39.000 If they believed in the things we were saying, then they would think, oh, maybe I'll try this.
00:08:44.000 And it really started with just great writers that Hamish knew or the people that we brought on knew.
00:08:49.000 And we just were like, here's what we're doing and why.
00:08:52.000 Do you guys get resistance?
00:08:54.000 Because I know there were some people that were writing bad things about Substack or saying that Substack is a bad idea.
00:09:04.000 What was the argument?
00:09:06.000 There's been a few.
00:09:08.000 I mean, one that comes up a lot is...
00:09:11.000 There's been a few generations of it.
00:09:13.000 At the very start, you were asking how we started.
00:09:15.000 The argument was like, no one's going to pay for anything, you idiots.
00:09:18.000 Right.
00:09:18.000 It's like, you know, writers on the internet, social media's bad.
00:09:21.000 Yeah, all sounds good theoretically, but I'm never going to pay for anything.
00:09:25.000 Never going to work.
00:09:26.000 Good luck.
00:09:28.000 And I had this parlor trick where I'd run on people.
00:09:31.000 I'd just be like, well, who's your favorite writer?
00:09:33.000 After they just told me they would never pay for a writer, I'm like, who's your favorite writer?
00:09:36.000 They'd say, ah, it's so-and-so.
00:09:37.000 I'd be like, would you pay five bucks a month to, like, get their stuff directly?
00:09:41.000 And they'd be like, yeah, I probably would, but that's different.
00:09:44.000 It's that person.
00:09:44.000 It's this thing.
00:09:46.000 And so there was this weird thing where nobody thought it would work in the abstract, but it worked once you had something that you cared about.
00:09:52.000 So we kind of crossed the, like, it's never going to work thing, and then immediately got into the it's working and it's bad time for a bunch of things.
00:10:01.000 Probably the most prominent of which is we started with this really strong commitment to free speech.
00:10:06.000 If we think that we're making a platform for writers that is...
00:10:10.000 You know, can be a positive force in our intellectual climate.
00:10:14.000 We just think that's table stakes.
00:10:16.000 That's something that's an important principle.
00:10:18.000 And we came up in a time that not everyone believes in that at all.
00:10:23.000 We took a lot of shit for a bunch of different times for, well, why do you let this person send emails to people that want to get it from them?
00:10:31.000 Specifically, can you say what writers were problematic?
00:10:36.000 Or would you like to avoid that?
00:10:38.000 We could avoid that.
00:10:39.000 I don't want to put anyone on blast.
00:10:40.000 Okay, let's just talk about, like, subjects.
00:10:42.000 Like, what would you do?
00:10:43.000 Like, here's a question.
00:10:44.000 What if you had, I mean, I think Substack is, like, how many people do you have on it right now?
00:10:49.000 How many people?
00:10:50.000 In terms of writers.
00:10:52.000 How many people?
00:10:52.000 There's, like, tens of thousands.
00:10:54.000 Wow.
00:10:55.000 Pretty cool.
00:10:56.000 It's going.
00:10:57.000 Yeah, it's wild.
00:10:57.000 That happened so quickly.
00:10:59.000 What if you got, like, a Holocaust denier that starts publishing stuff on Substack?
00:11:05.000 So we have a terms of service that we set out that has a couple of really strict, really tightly construed things.
00:11:13.000 Most of it's like, you can't spam, you can't do these things.
00:11:16.000 We do have a couple of things.
00:11:18.000 There's no porn.
00:11:20.000 You're not allowed to advocate for literal violence.
00:11:23.000 There's a few things that are sort of just like bright lines that are intended to be kind of like a really high bar and allow for space where there's a lot of shit on Substack that we ourselves disagree with and find awful.
00:11:38.000 We think that the old school ACLU approach on this is correct, where they're protesting to help the Nazis have their free speech rights.
00:11:48.000 Not because we think those things are good, but because we think that airing them is more valuable and in the long run better than trying to solve the problem by censoring them.
00:11:58.000 So there are people that are on Substack that everybody sort of agrees are gross?
00:12:04.000 I mean, I don't know if there's anybody that everybody agrees are gross.
00:12:07.000 But for any individual, I think anybody that exists could find someone on Substack that they think is the greatest thing ever, and they could find someone on Substack that they think is terrible.
00:12:16.000 And we take that as a sign that we're doing it right.
00:12:19.000 So, no porn, what are the other ones?
00:12:24.000 I'd have to look it up.
00:12:27.000 Yeah, I think there's a doxing thing.
00:12:29.000 There's, like, advocating violence.
00:12:31.000 There's a few, like, kind of things that just break the edges.
00:12:34.000 Do you allow erotica?
00:12:35.000 Like, if someone published, like, Bigfoot porn, do you ever read those?
00:12:40.000 Our genius take on this is that we disallow porn, but we allow erotica.
00:12:44.000 And it turns out that's, like, a non-trivial thing.
00:12:47.000 But the intention is, like, look, there's already OnlyFans.
00:12:50.000 If you're just doing...
00:12:52.000 That.
00:12:52.000 There's a place for you.
00:12:55.000 And we're not...
00:12:56.000 I don't even think...
00:12:58.000 I don't have a problem with that.
00:12:59.000 I don't think that's wrong.
00:12:59.000 But that's just not the thing that we're trying to serve.
00:13:03.000 Right.
00:13:03.000 But people are allowed to do that.
00:13:06.000 They're allowed to write...
00:13:08.000 Yeah.
00:13:08.000 Yeah.
00:13:09.000 And so when it comes to like controversial things, like I'm sure a big controversy, I know Alex Berenson was a controversy because he was publishing a lot of negative studies and things on COVID vaccines that a lot of people didn't want him to talk about.
00:13:30.000 And this is how he got kicked off Twitter.
00:13:32.000 This is how he wound up suing Twitter and actually winning and getting back on Twitter, which is pretty fucking crazy.
00:13:38.000 I could not believe they let him back on.
00:13:39.000 Well, he was right.
00:13:41.000 The problem is he's citing scientific studies, you know, from other countries that they don't like what the data represents.
00:13:50.000 And there was also the CDC study where they were talking about boosters for 18 to 49 people and they didn't want to release the data because they felt it would contribute to vaccine hesitancy.
00:14:00.000 And he's like, what the fuck are you doing?
00:14:03.000 And that kind of stuff, like publishing stuff that makes people uncomfortable but that is actually accurate, it's a big part of journalism.
00:14:12.000 And he wasn't able to do that in these public forums like Twitter.
00:14:19.000 And I would go a step further.
00:14:21.000 I mean, I think publishing things that are true but uncomfortable is obviously journalism, and the value to that is obvious.
00:14:28.000 And history is exceedingly clear that you can't always sort out at the time, which is which.
00:14:34.000 And the thing that that leads us to is that even things that aren't, like, we don't want to be in the business of trying to adjudicate what's true.
00:14:43.000 And so we don't even have...
00:14:44.000 We're not like, well, you can publish things that are true that make people mad.
00:14:46.000 We're kind of like, look, this is...
00:14:48.000 You can publish things.
00:14:49.000 This is your thing.
00:14:50.000 People are trusting you.
00:14:51.000 You're not subscribing to Substack.
00:14:52.000 You're subscribing to some writer.
00:14:55.000 And we think that it's better to allow the stuff to be, like, to have people have a platform and have freedom of speech and let that stuff get sorted out because...
00:15:07.000 All of the alternatives that sound really good end up in disaster.
00:15:12.000 Yeah.
00:15:13.000 No, I couldn't agree more.
00:15:14.000 I mean, a good, funny example is Trump's Truth Social, right?
00:15:21.000 So he puts out this new social media platform, Truth Social, and then they start censoring people who talk about January 6th.
00:15:31.000 Like, apparently, they're...
00:15:33.000 Like, let me talk about it in a negative way.
00:15:35.000 If they're like, January 6th might not have been great, they're like, you're banned.
00:15:38.000 Well, I don't think it's that simple.
00:15:40.000 I think when they start talking about people who either incited or were saying things or...
00:15:49.000 Let's find out.
00:15:51.000 Because Truth Social does censor things on the January 6th committee and all the investigations that are currently underway.
00:16:01.000 I don't think they want to shine a light on the fact that not only did that happen, but there's some really troubling things about a lot of people that were involved in it.
00:16:13.000 As soon as you get to the place where you think your job as a platform, as somebody that's making the things where people are publishing their ideas, is not to let people publish what they want and let the market sort it out,
00:16:30.000 but instead to push some narrative, even if the narrative is right, even if you're like, you know, trying to push something that's unambiguously good and true, by trying to push it through censorship and through forced conformity,
00:16:45.000 you end up doing more harm than good, at least we believe.
00:16:49.000 Do you think that social media was the driving force for these ideologically driven journalists now?
00:16:59.000 Instead of being a journalist that reports uncomfortable truths even if they don't agree, even if the side that they're supposedly on, whether they're conservative or liberal, whatever those uncomfortable truths are, that fly in the face of whatever the narrative is that that side is pushing.
00:17:17.000 Is it social media in the echo chambers and worried about the blowback from either followers or other journalists?
00:17:27.000 What caused that?
00:17:30.000 I think that's definitely part of it.
00:17:32.000 And I think, you know, it's not a new thing, right?
00:17:35.000 If you look at every age, people who are saying things that are true and are uncomfortable to some dominant narrative, that's always, like, people always take flack.
00:17:46.000 There's always efforts to censor them.
00:17:48.000 You can go back and look at, you know, people who are accused of being copyrighted.
00:17:51.000 Yeah.
00:18:07.000 Displaying being very upset can create a false sense of consensus.
00:18:15.000 Twitter and some of these platforms, you can get even like 30 or 40 people that are just really mad, can make it look like the whole world is coming down around you and that everybody hates you and wants to burn you as a witch.
00:18:31.000 Even if that's not actually true, even if it's just a small fraction of people feel that way.
00:18:36.000 And if either you aren't strong enough to deal with that, or more likely if you're part of an institution that doesn't have the fortitude and the principles to push back against that, that's where I think that force can cause things to crumble in a way they ought not to.
00:18:53.000 And that is what we're seeing.
00:18:55.000 Yeah.
00:18:55.000 And it is, well, it's an increasingly growing number of people, but it was relatively small initially.
00:19:02.000 And now it seems like there's these mobs of people that hop on any narrative to try to enforce, like, it's almost like they're just trying to get a win for the team.
00:19:14.000 It seems very strange that that's taking place in journalism because it's always disturbing to me that people don't remember the lessons of the past and we have to keep making the same mistakes over and over again.
00:19:29.000 Like the example that you used at the ACLU, how they were literally defending Nazis and their right to free speech because their perspective was free speech should be an absolute thing.
00:19:39.000 And the correct response to that is not to censor these people.
00:19:46.000 It's to correct them with correct speech.
00:19:50.000 The optimistic take on this is that every generation has to learn this for themselves.
00:19:55.000 People forget.
00:19:57.000 The lessons from the past don't feel real until you've lived through it.
00:20:02.000 But that once we live through it, people will understand.
00:20:06.000 I think people are starting to get a new appreciation for why free speech is a principle that matters.
00:20:15.000 I think if you come up taking that for granted and living in a world that you enjoy all the benefits of that without ever having to really think about it, you can forget why it matters.
00:20:25.000 And then as soon as that comes for you, it flips it back on.
00:20:30.000 I think we'll see if people will come in the other direction.
00:20:33.000 I think people are starting to turn in that direction now, and I think it speaks to the success of Substack that people are recognizing that you really do have to have some sort of a forum where someone can speak their mind and not have... I mean, your fears of criticism, I mean, that's not the issue. The issue is being deplatformed, where you can't express yourself anymore because whatever you're saying troubles people. Right.
00:21:02.000 And then the consensus reality is that these viewpoints, nobody is saying these things, when in fact there are people that would be saying them if they hadn't been kicked out of the thing.
00:21:15.000 That has been one of the things that we've...
00:21:28.000 I think we're good to go.
00:21:41.000 Quality versus just time and entertainment has been eroded because of the platforms that have taken over.
00:21:49.000 And these two things go hand in hand.
00:21:51.000 But having people read things that are smart and good by somebody whose incentive is to earn and keep people's trust, even if it's not something they're going to get canceled for, that stuff doesn't always exist unless you have a model that supports it.
00:22:09.000 And with Substack, do you have an algorithm, like say if you enjoy Bari Weiss's work, I recommend this?
00:22:18.000 So the way we do this is very thoughtful because on the one hand, we want to have a network effect for the platform, right?
00:22:25.000 We want it to be true that when you come to Substack, you know, yes, it's a great tool.
00:22:29.000 Yes, it's all these good things.
00:22:30.000 It's free until you take money and then we charge you 10%.
00:22:34.000 But we also want it to be like, you're going to grow, right?
00:22:36.000 You're going to find people who would love your stuff are going to be able to find it.
00:22:40.000 And by being here, you get more benefit than we're asking in return.
00:22:43.000 On the flip side, all of the obvious ways that you would do that, if we were to copy the way that Twitter does this or the way that YouTube does this, we would just be recreating some of the things that we kind of like set out to fight against.
00:22:57.000 And so as we build those features, we do have a thing that introduces you to recommended writers.
00:23:01.000 The difference is it's not Substack or Substack's algorithm that's recommending.
00:23:06.000 It's the writer that you already subscribe to that's recommending.
00:23:09.000 Oh.
00:23:10.000 Well, I like how Bari uses her page because she uses her page to promote other journalists and other writers.
00:23:16.000 Yeah.
00:23:16.000 Yeah, and we also have that.
00:23:17.000 She has the guest post, and she's sort of like, the same way that people coming on here can be like a career-making thing, like people going on Common Sense can be like a major turning point, and she can bring somebody into the world that the world needs.
00:23:30.000 When I say the pushback against Substack, have there been major critical articles written about Substack?
00:23:37.000 Yeah, I mean, we've had a few.
00:23:39.000 There's been a series of fun ones from the New York Times.
00:23:42.000 There was one that was like, is the Substack economy bad for democracy?
00:23:47.000 Which we thought was good.
00:23:48.000 There's been a few critical ones.
00:23:51.000 What was their argument?
00:23:52.000 How could you be bad for democracy?
00:23:55.000 First of all, what a loaded thing to say.
00:23:57.000 Is it bad for democracy?
00:23:59.000 Is it the end of human civilization?
00:24:01.000 Will Substack kill us all?
00:24:03.000 Yeah, that's it.
00:24:05.000 Subscribe for the next year to find out.
00:24:06.000 That's the boiled down headline.
00:24:10.000 You know, like, YouTubers do that.
00:24:12.000 They'll have, like, a title to the video that's just pure clickbait with a crazy photo on it.
00:24:17.000 You know, Betteridge's Law of Headlines, which is if the headline asks a question, the answer is just always no.
00:24:25.000 Right.
00:24:26.000 UFOs, are they real and among us today?
00:24:28.000 No.
00:24:28.000 Probably.
00:24:29.000 Maybe.
00:24:31.000 Yeah, maybe I picked the wrong example there.
00:24:33.000 Maybe in this room.
00:24:34.000 I mean, look, we knew what we were getting ourselves into when we set out to build this thing, right?
00:24:39.000 We're like, hey, we're making something that is aimed at the heart of the culture war.
00:24:44.000 We're making something that we think can make some small positive difference in the forces that are tearing things apart and breaking things down.
00:24:55.000 And if we are successful at that, there's no world where we get to do that where we don't have to take a lot of heat.
00:25:01.000 Yes.
00:25:02.000 You know, we're going to make people—there's no way to do that without making the people in the existing places mad.
00:25:08.000 And I think that doesn't mean that we shouldn't listen to criticism.
00:25:11.000 That doesn't mean if we're wrong about stuff and people point it out to us, we should, like, cover our ears.
00:25:16.000 But the correct amount of skeptical press and skeptical attention that Substack gets is never going to be zero.
00:25:24.000 Yeah.
00:25:24.000 Yeah, no, I think it's good.
00:25:26.000 I mean, I think in a way they're advertising for you, for sure.
00:25:30.000 I mean, not in a way.
00:25:32.000 I mean, they're definitely doing it.
00:25:33.000 What they're doing by complaining about you is sort of the same thing that mainstream media does about podcasts.
00:25:39.000 They don't understand what they're doing.
00:25:40.000 That's how Trump got elected.
00:25:41.000 Yeah, exactly how Trump got elected.
00:25:44.000 They were talking about him constantly and it gave him press.
00:25:48.000 And the thing about what's happening with Substack that parallels with mainstream media and podcasting is that they're bringing about their own demise by their very format.
00:26:02.000 But what they're doing is sort of highlighting the strengths of what you guys have been able to accomplish.
00:26:08.000 And one of the parallels in podcasting is, you know, a show like CNN is never going to be able to truly compete with a show like Breaking Points.
00:26:19.000 Breaking Points on YouTube with Saagar and Krystal Ball.
00:26:22.000 The reason why they're not going to compete is that, first of all, they're captured by whether it's executives or the corporations that run them.
00:26:30.000 They're not independent.
00:26:31.000 And so they have to have a slant.
00:26:34.000 On whatever particular thing that's in the news that whatever interests need them to have a slant on, brought to you by Pfizer.
00:26:45.000 We see these things over and over and over again in mainstream media to the point where people have lost their faith that this is objective journalism.
00:26:54.000 And so these other shows thrive, like the Jimmy Dore show and all these shows that highlight real problems.
00:27:00.000 And the real journalists of the world, like guys like Glenn Greenwald and Matt Taibbi, like those people flock to them now.
00:27:09.000 Like, surely there's got to be a rational take on this.
00:27:11.000 Where the fuck do I get it?
00:27:12.000 And then, boom, Substack is there.
00:27:16.000 Yeah.
00:27:18.000 I think that's right.
00:27:19.000 I mean, isn't your show ten times bigger than the biggest CNN show?
00:27:24.000 Supposedly.
00:27:25.000 I'm skeptical that it's only ten times bigger.
00:27:27.000 I can't imagine anybody's really watching that.
00:27:30.000 Who's watching this stuff?
00:27:31.000 It's all like TV screens in airports.
00:27:33.000 It is a lot of that, but I think they've kind of eliminated it from airports.
00:27:36.000 Isn't that true recently?
00:27:38.000 Didn't they lose some sort of a deal with airports?
00:27:40.000 People got tired of being bummed out at the fucking airport.
00:27:45.000 I mean, the format sucks, period.
00:27:47.000 The format where you have a conversation for five minutes and then you have to let the person go because there's a commercial break.
00:27:54.000 And then right after that you have another thing scheduled.
00:27:56.000 It ensures that you're never going to get a real deep dive into something.
00:28:01.000 And we all know, as human beings, if you want to talk to someone about something, it takes time.
00:28:06.000 If you want to really find out what someone's opinions are, you have to have a conversation with them.
00:28:10.000 And you have to find out where they're coming from.
00:28:12.000 What's your perspective?
00:28:13.000 How did you grow up?
00:28:14.000 Where did you form these ideas?
00:28:16.000 What was a major turning point in your philosophy and the way you view things?
00:28:21.000 And you have to want that in the first place.
00:28:23.000 Yeah.
00:28:23.000 You have to just be curious what people actually think, rather than bring people on to be the avatar of some opinion you already want, like the theme you already want the show to be about.
00:28:33.000 Exactly.
00:28:34.000 And so, in many ways, it's sort of ensuring the success of Substack.
00:28:40.000 Because the system appears, if it's not broken, it's on a really shitty foundation.
00:28:48.000 Let's hope.
00:28:49.000 Yeah.
00:28:49.000 I don't think you have to hope.
00:28:52.000 Let's hope that it's good for Substack.
00:28:54.000 I'm not cheering for the demise of anything, really.
00:28:58.000 Oh, I'm not either.
00:29:00.000 I mean, I still read the New York Times and Washington Post and a bunch of other periodicals.
00:29:05.000 I think that it's important.
00:29:07.000 I think there's a lot of people that work for The Times and a lot of these other organizations that are much maligned that do great work.
00:29:13.000 They're great journalists, but I think they're captured in some ways by the institution that they work for.
00:29:21.000 And it's flooded with ideologically driven people and a very left-leaning ideologically driven populace.
00:29:29.000 This is not, like, there's not a lot of, like, very popular, very influential right-wing publications that compete with all the left-wing publications.
00:29:40.000 It's very unbalanced.
00:29:41.000 And it's not like the right-wing outlets that do exist are bastions of, you know, free speech and independent ideas and not having a narrative.
00:29:48.000 You mean OAN News is not awesome?
00:29:50.000 I can't believe they're not...
00:29:52.000 They're doing the truth social thing, right?
00:29:53.000 Where it's the same enforced narrative, enforced views, just from a different way.
00:29:59.000 And the same mechanisms in a way that breaks the thing.
00:30:01.000 And they've been...
00:30:02.000 You know, we talk about the way the internet changed all this stuff.
00:30:06.000 The internet changed all these businesses, too.
00:30:08.000 Because they exist in a world where they're competing for a slice of people's attention with Twitter, with TikTok, with YouTube, with everything.
00:30:17.000 So there's this like...
00:30:18.000 This game where you're trying to grab people at all costs is just a tough game for the truth.
00:30:28.000 Yeah.
00:30:29.000 It is a tough game for the truth, but isn't that sort of what creates the interest for your very business?
00:30:37.000 Like, that's kind of cool for you.
00:30:39.000 I mean, it really is like kind of a perfect storm in many ways.
00:30:44.000 The thing that I get the most excited about with Substack is getting to hang out with my heroes a bunch.
00:30:52.000 Getting to meet people who are making things on Substack that I think matter for the world.
00:30:57.000 Who are some of the ones that you go to on a regular basis on Substack?
00:31:01.000 Who are some good ones?
00:31:03.000 You should have Ethan Strauss on here.
00:31:05.000 This guy does House of Strauss.
00:31:06.000 He's an ex-sports journalist.
00:31:08.000 Strauss?
00:31:08.000 Strauss.
00:31:09.000 Strauss.
00:31:10.000 S-T-R-A-U-S-S. This is just at the tip of my tongue.
00:31:15.000 He was a sports journalist and kind of left the...
00:31:20.000 For a bunch of these same things, he kind of got disillusioned with how things were going and just started a newsletter and a podcast on Substack.
00:31:26.000 And it's just fascinating.
00:31:27.000 I'm not even a sports guy.
00:31:28.000 I'm a computer nerd.
00:31:29.000 I don't follow sports.
00:31:32.000 But I read and listen to this guy, and it's just fascinating.
00:31:36.000 You guys have podcasts on Substack?
00:31:38.000 Yeah, we added a podcast thing.
00:31:39.000 So you can do...
00:31:40.000 A lot of writers want to like...
00:31:41.000 Interesting.
00:31:43.000 Do you have video?
00:31:44.000 Yeah.
00:31:46.000 Yeah, we added video.
00:31:48.000 That's going pretty good, actually.
00:31:52.000 That's great.
00:31:53.000 Why do you ask?
00:31:54.000 Well, I mean, what I ask is I was very concerned with where podcasts were going.
00:31:59.000 I mean, Apple has been pretty cool.
00:32:01.000 They never gave us a hard time.
00:32:03.000 But YouTube, they gave us a hard time about a bunch of episodes, particularly during the pandemic when they didn't like having dissenting opinions and different scientists that had different perspectives.
00:32:16.000 There's one woman on Substack who's amazing, Emily Oster.
00:32:19.000 She wrote some books about pregnancy.
00:32:22.000 She's like an economist that writes about the real...
00:32:25.000 There's so many crazy myths around pregnancy and raising kids.
00:32:29.000 And she just writes the real stuff.
00:32:31.000 And she did some podcast episodes.
00:32:33.000 And a lot of it's like, my kid's in COVID. What do I do?
00:32:36.000 How much should I worry?
00:32:37.000 And she's super good and mainstream and sensible about all this stuff.
00:32:42.000 And I went to...
00:32:44.000 She does the podcast on Substack, but it's available on Spotify, too.
00:32:47.000 I went to Spotify, and half the page has these warnings they put on there that's like, look out!
00:32:53.000 If you read this, don't believe anything that's in here.
00:32:56.000 Here's the stuff you should believe, and it's links to the official government sources and a few news things.
00:33:02.000 And it's...
00:33:03.000 I understand why it's there.
00:33:04.000 I understand why people feel like having accurate information matters.
00:33:09.000 I don't think that they're coming from an evil place wanting to do that.
00:33:15.000 But it makes me queasy.
00:33:16.000 I don't think that it's the right...
00:33:18.000 I worry that we're losing our minds on that stuff, I guess.
00:33:21.000 Yeah.
00:33:22.000 Well, the problem is oftentimes they're wrong.
00:33:25.000 And a lot of that information that they say is disinformation or misinformation turns out to actually be accurate.
00:33:32.000 And the place where you're getting misinformation is the mainstream reporting.
00:33:37.000 What's true is often very hard to sort out in advance, especially when something's developing rapidly, things are changing.
00:33:45.000 It's the idea that, you know, you can have an official source that can just adjudicate in real time what's real and what's not is a fiction, right?
00:33:55.000 And I think everybody, when pressed, would admit that.
00:33:58.000 Nobody thinks that there's one authority you could go to and say, yes, they're going to have exactly all the answers.
00:34:03.000 And yet, when you get into the regime of saying, well, who's allowed to talk about this?
00:34:08.000 You know, this is the thing people fall back to.
00:34:10.000 Like, well, we'll just see who the official things say are good.
00:34:13.000 And it's even more complicated than that because individual people, the kind of thing that makes people interesting, and this is true in, like, science and technology as well, is the kind of, like, personality that makes you someone that can do exceptional things can also lead you to do crazy things.
00:34:32.000 I like the example of Isaac Newton, who co-invented calculus, invented a lot of the physics that engineers use to this day, and it turned out that he had a side hustle of weird Bible conspiracy theories slash alchemy,
00:34:51.000 where he had this thing where he was basically a lunatic, like a conspiracy theory guy.
00:34:55.000 And he did both.
00:34:57.000 He had a crazy...
00:35:00.000 Lunatic hobby and co-invented calculus and modern physics.
00:35:03.000 And if you look at him and said, and this is a thing that could have happened at the time, and said, well, this is heresy.
00:35:09.000 Like, this one thing you're doing is crazy.
00:35:11.000 Like, you're canceled.
00:35:13.000 You miss out on something.
00:35:15.000 Yeah, for sure.
00:35:16.000 And Isaac Newton, didn't he also die a virgin?
00:35:20.000 I don't know.
00:35:21.000 I think he did.
00:35:22.000 I think there was a thing about...
00:35:27.000 I just think he was like averse to human contact, which, I mean, he might have been on the spectrum, right?
00:35:34.000 I mean, if you really think about it, the type of person that could develop, you know, a theory of gravity.
00:35:42.000 What year was that?
00:35:44.000 I don't know.
00:35:45.000 I mean, what the fuck did they know?
00:35:46.000 I mean, the alchemy, he probably thought he was on to something.
00:35:49.000 Yeah, I mean, both of these probably seemed promising to him.
00:35:52.000 Like, to this guy that's smart enough to co-invent calculus, was like, alright, I've got two things going here.
00:35:58.000 They're both pretty interesting.
00:36:00.000 If one of these pans out, I'm set.
00:36:02.000 Yeah, that is a thing about...
00:36:05.000 Look, I'm a giant Hunter S. Thompson fan.
00:36:08.000 Huge.
00:36:09.000 But he was a terrible person in a lot of ways.
00:36:12.000 I mean, a lot of the stuff he did and a lot of the stuff he said, he was just constantly snorting coke and drinking and screaming at people and throwing things at them.
00:36:23.000 There's a famous video of him having a gunfight with his neighbor.
00:36:27.000 Have you ever seen that video?
00:36:28.000 No.
00:36:29.000 Okay, find that.
00:36:30.000 Literally a gunfight?
00:36:30.000 Hunter S. Thompson has a gunfight with his neighbor.
00:36:32.000 Yeah, they're shooting at each other.
00:36:33.000 Yeah, but not only that, but there's cameras there.
00:36:37.000 Hunter's leaning around a corner and shooting at his neighbor.
00:36:39.000 This is like Colorado in the 1970s.
00:36:43.000 Yeah, he was out of his fucking mind.
00:36:45.000 That's wild.
00:36:45.000 But, also, fucking brilliant.
00:36:49.000 Like, and some of the things that he wrote to this day, you know, I'll go over some of those passages, and they just...
00:36:56.000 The people who did this, uh, Declaration of Independence and the Constitution, were, uh, good people.
00:37:11.000 It's a good place.
00:37:12.000 Here we are in the middle of it, up on the mountain.
00:37:14.000 If this son of a bitch wants to bitch about his cows over here and shoot at me, well, it's our country.
00:37:21.000 It's not theirs.
00:37:22.000 It's not a bunch of used car dealers from Southern California.
00:37:26.000 Democracy, you have to be a player.
00:37:30.000 I wish this guy was around to be on Substack.
00:37:33.000 He probably wouldn't have wrote much, unfortunately.
00:37:36.000 He was just too busy getting drunk and high at the end of his life.
00:37:39.000 You could have caught him, you know, if you caught him in like 1966, 67, it would have been fun.
00:37:44.000 You know, I think he would have been perfect for Substack.
00:37:48.000 I think Matt Taibbi, who's big on Substack, had the same job at the Rolling Stone.
00:37:51.000 Yes.
00:37:52.000 Yeah, he did.
00:37:53.000 I love Matt Taibbi.
00:37:56.000 I think there's room for a certain amount of chaos in individuals that have something to offer.
00:38:06.000 You just can't throw the baby out with the bathwater.
00:38:11.000 I think this is how you make progress.
00:38:13.000 If you insist on having a low variance and everything needs to be as safe and as good as possible, you might limit the downside or how wrong things are, but inevitably you also limit the upside.
00:38:26.000 If you prevent people from, you know, doing something dumb that's against the consensus, you also prevent them from doing something that's genius that's against the consensus.
00:38:37.000 And it's that thing, there's like asymmetric upside there.
00:38:40.000 It's that genius thing that moves the world forward.
00:38:43.000 And so if you cut it off, it breaks things.
00:38:47.000 And that's the argument for freedom, right?
00:38:49.000 And that's the argument for freedom of expression, freedom of speech and of thought, and the ability to be wrong, the ability to communicate in a way where you don't have to jump through hoops to get your thoughts out.
00:39:09.000 And, you know, I think there's pros and cons that come with that, right?
00:39:13.000 You know, you're going to have some people write things that, well, boy, it would be nice if we had an editor to go over this.
00:39:20.000 But it's also really nice when I know there's not an editor.
00:39:23.000 When I love reading, like, Bari's writing, that I know that's coming from her.
00:39:28.000 This is not fucked with.
00:39:30.000 Because I've talked to Bari about pieces that she wrote where, I mean, there's a thing she wrote about me for the New York Times where they had a change, they wouldn't let her say something.
00:39:38.000 Or they add a headline that's like subtly different than the thing.
00:39:42.000 So editors can be good though, right?
00:39:44.000 There's people on Substack that have editors.
00:39:46.000 The difference is the writer hires the editor.
00:39:48.000 Yes, yes.
00:39:49.000 Well, it's a collaborative venture.
00:39:51.000 It's a collaborative venture and ultimately the person that you're subscribing to, the person that you're choosing to trust is the person that you're hearing from.
00:39:57.000 Oh, yeah.
00:39:58.000 I mean, I'm not opposed to other people's contributions.
00:40:01.000 I mean, with stand-up comedy, it's critical.
00:40:03.000 We rely on the audience for their contribution, and we also rely on other comedians.
00:40:08.000 Like, someone will say, I think if you said it this way, it'll work better.
00:40:10.000 And they're like, ooh, you're right.
00:40:12.000 Ooh, thank you.
00:40:13.000 You know, it's like, you need other perspectives.
00:40:15.000 But you also, you know, there's, I mean, I guess this is what you get from podcasts.
00:40:20.000 This is one of the things that I'm so fascinated about Substack is that I find real parallels with the way Substack is dealing with journalism versus the way podcasts are dealing with free-form conversations.
00:40:33.000 And there's a lot of similarities in this world where it's like, okay, the attention monster social media things are taking over everyone's attention.
00:40:41.000 Everyone's got their phone now.
00:40:42.000 People don't go to websites anymore.
00:40:44.000 Everything has to be in one of the apps on your home screen to be in your life.
00:40:50.000 And it turns out that most of the apps on your home screen are controlled by one of these algorithms that's kind of working against you to just grab as much of your time as possible, with a few exceptions.
00:40:59.000 One is the podcast app, where it's using this RSS format where you subscribe to things.
00:41:03.000 And then those things from the people you subscribe to show up.
00:41:07.000 And you have this unmediated connection where you can actually choose who you want to spend your mind and life with.
00:41:14.000 And another one is the email app, where people can send you emails.
00:41:19.000 And those last sort of like bastions of direct connection between people that are making things and people that care about them is the source of a lot of the power of the model, I would say.
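The "unmediated connection" of RSS he's pointing to is concrete: the app fetches a feed directly from each publisher and shows its items as published, with no ranking layer deciding what you see. A rough sketch of reading one such feed using only the Python standard library (the feed content below is invented):

```python
import xml.etree.ElementTree as ET

# A tiny made-up RSS feed, the kind a podcast app fetches straight from the publisher.
FEED = """<rss version="2.0"><channel>
  <title>Example Podcast</title>
  <item><title>Episode 1</title>
        <enclosure url="https://example.com/ep1.mp3" type="audio/mpeg"/></item>
  <item><title>Episode 2</title>
        <enclosure url="https://example.com/ep2.mp3" type="audio/mpeg"/></item>
</channel></rss>"""

root = ET.fromstring(FEED)
# Every item from a subscribed feed is listed in order; nothing re-ranks it for engagement.
for item in root.iter("item"):
    print(item.findtext("title"), item.find("enclosure").attrib["url"])
```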
00:41:32.000 This thing where you're, you know, I'm subscribing to you on Substack, I'm listening to your podcast because I trust you to curate a slice of my intellectual life for me.
00:41:42.000 If what I read, what I listen to is who I am, you're one of the people I want shaping who I am, that's a big investment.
00:41:49.000 We shouldn't be handing that off to what Twitter thinks will make me mad.
00:41:53.000 Yeah, for sure.
00:41:54.000 What do you do when you guys have meetings and you look at how do we make this better?
00:42:01.000 What are our problems?
00:42:03.000 What are the dilemmas that you guys encounter?
00:42:08.000 So the big thing we've done that's good is we picked a business model that aligns us with the people on the platform, right?
00:42:16.000 So it's free to publish.
00:42:18.000 Once you start charging money, we take 10%.
00:42:20.000 So if you're a writer, when you make more money, that's how Substack makes money.
00:42:24.000 When you're a reader, when you find stuff that's valuable enough that you actually want to choose to pay for it, that's also how Substack makes money.
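As arithmetic, the alignment he's describing is simple: the platform's revenue is a fixed fraction of what writers earn, so it only grows when writers' income grows. A toy calculation assuming the flat 10% fee mentioned here and ignoring payment-processing costs:

```python
def monthly_split(price_per_month: float, paying_subscribers: int,
                  platform_fee: float = 0.10) -> tuple[float, float]:
    """Return (writer_income, platform_income) for one month."""
    gross = price_per_month * paying_subscribers
    platform_cut = gross * platform_fee
    return gross - platform_cut, platform_cut

writer, platform = monthly_split(5.00, 1_000)
print(f"writer: ${writer:,.2f}/month, platform: ${platform:,.2f}/month")
# writer: $4,500.00/month, platform: $500.00/month
```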
00:42:31.000 And so that sort of guides us in the things we want to build.
00:42:35.000 It's like, hey, we want to do the things that help writers, which are all the things that help readers, which are also the things that help Substack.
00:42:41.000 And the dilemmas end up being, like, okay, how do we do that?
00:42:44.000 And how do we do that in ways that don't erode the fundamental value that we're creating?
00:42:51.000 Right?
00:42:52.000 Because there's lots of sort of, like, short-term things that we could do that would seem like really great ideas.
00:42:57.000 Like, why don't we just...
00:42:59.000 I think?
00:43:22.000 The power that all of these social media platforms have harnessed, but do it in a way that puts the people in charge, puts the writers and the readers in charge.
00:43:30.000 There's not really a blueprint for that because that hasn't existed, I don't think, fully until now.
00:43:36.000 What do you do other than an algorithm?
00:43:39.000 And people are terrified of algorithms because they've seen the effect that it's had on Twitter and YouTube.
00:43:46.000 It's unfortunate, but it does sort of highlight the worst instincts in human nature in terms of accumulating information.
00:43:54.000 You go towards things that are outrage-oriented.
00:43:58.000 Yeah.
00:43:59.000 The algorithms, it's a bit of a misnomer, too.
00:44:02.000 I'm a software guy and nerdy about this stuff, and it's sort of like everything's an algorithm.
00:44:06.000 It's like, are you using an algorithm?
00:44:08.000 Are you using electricity?
00:44:08.000 Like, of course you are.
00:44:09.000 When we say algorithm, we mean, like, something that is showing you stuff in a way to achieve some goal that it has that might or might not be your goal.
00:44:23.000 And so I think the way to think about it is not like, do you have an algorithm or not?
00:44:27.000 But it's like, what is that algorithm trying to do?
00:44:30.000 Right?
00:44:30.000 If the algorithm is trying to get you to use TikTok for as long as possible every day, that's going to have a different consequence than an algorithm that's trying to introduce you to a writer that you trust enough that you might want to pay for them and care about them.
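The point about what the algorithm is trying to do can be made concrete: the same feed code behaves very differently depending on the objective it sorts by. A schematic contrast with two invented scoring functions over the same posts:

```python
posts = [
    {"title": "Calm explainer", "expected_minutes": 3,  "outrage": 0.1, "trusted_writer": True},
    {"title": "Rage bait",      "expected_minutes": 12, "outrage": 0.9, "trusted_writer": False},
]

def engagement_score(post):
    # Objective: keep you on the app as long as possible.
    return post["expected_minutes"] + 5 * post["outrage"]

def trust_score(post):
    # Objective: surface work from writers the reader has chosen to trust.
    return 1.0 if post["trusted_writer"] else 0.0

print([p["title"] for p in sorted(posts, key=engagement_score, reverse=True)])
# ['Rage bait', 'Calm explainer']
print([p["title"] for p in sorted(posts, key=trust_score, reverse=True)])
# ['Calm explainer', 'Rage bait']
```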
00:44:46.000 Right, but how do you do that?
00:44:48.000 How do you find an algorithm that's going to introduce you to someone that you would think would be interesting based on who you already think is interesting, other than creating an echo chamber?
00:44:58.000 These are the exciting problems we get to solve.
00:45:00.000 I'll tell you, some of the stuff that's working really well so far is this principle of putting the writers in charge and putting the readers in charge.
00:45:08.000 So we added a recommendations feature, and rather than say we're going to figure out who you want to follow,
00:45:15.000 We let the writers pick.
00:45:17.000 And you don't have to pick anyone.
00:45:18.000 You can say, I'm not going to, you know, people come to me, I'm not going to send anybody anywhere.
00:45:22.000 Or you can say, hey, if you're coming to me, one of the things that I can do for you is put you on to other things that I think are interesting, that I think are worthwhile.
00:45:30.000 And I'm sort of putting my name on that as something that you would check out.
00:45:33.000 Now, that's going to be less efficient if you just look at the numbers of how much engagement does that get.
00:45:38.000 It's going to be impossible to build something that is as efficient as the YouTube page that's like, I know what you want better than you do yourself.
00:45:46.000 But as a reader, I'm going to choose to spend my time on Substack, around that stuff, because it creates a real alternative.
00:45:54.000 Because I know that I'm not giving my mind to something that's kind of operating against me.
00:46:00.000 And I know that if I'm seeing something, there's like a human being that made that decision.
00:46:06.000 And I know who they are.
00:46:07.000 And it's sort of like about that trusted relationship rather than the algorithm, the scary thing.
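Structurally, the recommendations feature described above is closer to a hand-curated graph than a learned ranking: each writer optionally names other publications, and a reader is only shown recommendations coming from writers they already subscribe to. A minimal sketch of that shape (all names hypothetical):

```python
# Each writer's recommendation list is chosen by that writer, not computed by the platform.
recommendations = {
    "writer_a": ["writer_c", "writer_d"],
    "writer_b": [],  # recommending no one is a valid choice
}

def recs_for_reader(subscriptions: list[str]) -> list[str]:
    """Show only publications that the reader's own writers have put their name behind."""
    seen, suggested = set(subscriptions), []
    for writer in subscriptions:
        for rec in recommendations.get(writer, []):
            if rec not in seen:
                seen.add(rec)
                suggested.append(rec)
    return suggested

print(recs_for_reader(["writer_a", "writer_b"]))  # ['writer_c', 'writer_d']
```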
00:46:13.000 Are you Canadian?
00:46:14.000 I am.
00:46:14.000 Thank you for noticing.
00:46:15.000 Snuck it out.
00:46:16.000 A boat.
00:46:16.000 I heard it, buddy.
00:46:18.000 You only had one.
00:46:18.000 It's been like slowly disappearing.
00:46:20.000 Yeah.
00:46:21.000 There was a period of time when I first moved where my ears accommodated before my voice, and so I sounded like I had a funny accent to myself, which was very unsettling.
00:46:31.000 I feel you, because I had a Boston accent for a while, and then I heard myself on television.
00:46:36.000 I was like, ooh.
00:46:37.000 Can you still do the Boston accent?
00:46:38.000 Oh, yeah, if I get drunk.
00:46:41.000 Especially if I'm drunk with my friends from high school.
00:46:44.000 It'll come out, because they have it hard.
00:46:46.000 They still live there.
00:46:48.000 Are you a public company?
00:46:50.000 No.
00:46:50.000 Private company.
00:46:51.000 Do you intend on staying that way?
00:46:54.000 It seems like that's the only way you could avoid influence.
00:47:00.000 I'm not sure that's true.
00:47:03.000 I think that, I mean, the thing that, success for Substack looks like being an independent company, right?
00:47:08.000 We're trying to bring this thing into the world that's new, and we think that it's got a real business model that works.
00:47:13.000 We think we're onto something important, and the way that we can best serve that is staying independent and running it ourselves and making it into the best thing that it can be.
00:47:23.000 And I think at some point, you know, you can go public and do that, and there's ways to do it that don't subject you to those kinds of pressures.
00:47:34.000 How could you do that, though, if the whole business model is about...
00:47:38.000 I mean, if it's a public company and people buy stock in the company, you have an obligation to your stockholders to make a maximum amount of money.
00:47:46.000 Yep.
00:47:48.000 And this is actually maybe at the core of how I think about Substack.
00:47:52.000 One way you could say this is, well, we have a choice.
00:47:56.000 Either we can do the things that make us money, or we can do the things that we think matter.
00:48:00.000 And we're just going to be really good, virtuous people and ignore all that money and just do the things that matter.
00:48:06.000 And I think a better solution to actually making change is to find a way to set things up so that in order to make money, you want to do the things that matter.
00:48:16.000 Right.
00:48:18.000 I'm sorry.
00:48:19.000 You're offering yourself as a financially viable solution.
00:48:26.000 There's obviously a market for this.
00:48:28.000 And not just viable, but compelling.
00:48:30.000 There's ways that you can set it up where the things that we do to grow and the things that we do to be successful are also the things that make the ecosystem good and make the writer successful.
00:48:39.000 One example of this is on Substack.
00:48:41.000 As a writer, you own all your content.
00:48:44.000 You own your mailing list.
00:48:45.000 You have a direct billing relationship with people.
00:48:48.000 And if you want, you can leave.
00:48:50.000 And people do leave.
00:48:51.000 And it's terrible for us.
00:48:53.000 And we hate it.
00:48:54.000 People have left?
00:48:55.000 People have left.
00:48:55.000 And what do they do?
00:48:56.000 They just leave and they start their own website?
00:48:58.000 They use one of the other clones.
00:49:00.000 Some of them eventually come back.
00:49:02.000 I didn't know that there's people who have cloned Substack.
00:49:05.000 You don't have to mention names, but how many of them?
00:49:08.000 There's a handful.
00:49:09.000 Twitter and Facebook both copied us very shamelessly, which is one of these things that happens when you're making something.
00:49:17.000 Oh yeah, you can subscribe to people on Twitter now, right?
00:49:20.000 Is that real?
00:49:22.000 Like you can pay money to get better tweets or something?
00:49:24.000 Yeah, they have super followers.
00:49:26.000 Super followers?
00:49:27.000 Is that really what it's called?
00:49:29.000 Yeah, I think that's...
00:49:30.000 That's really what it's called?
00:49:30.000 That's a real thing they did.
00:49:32.000 The point, though, is that...
00:49:35.000 By tying our hands in this way, the fact that the people on Substack can leave means that the only way to make money and grow is to make it good enough that they choose to stay.
00:49:46.000 Other companies are like, we'll lock you in.
00:49:47.000 We'll make it so that you can't leave.
00:49:50.000 That's great.
00:49:50.000 But for us, we're like, well, you can leave.
00:49:52.000 Therefore, we have to actually do the work to keep you.
00:49:55.000 That means that in order to succeed, we have to do the right thing.
00:50:00.000 And I think that's the way that you actually make change in business at scale.
00:50:05.000 It's not by being like, we're angels.
00:50:08.000 We'll turn down the money.
00:50:09.000 We'll do this nice thing.
00:50:10.000 It's like, no, let's figure out a thing that actually works and makes financial sense and does something that matters.
00:50:16.000 Well, when you say turn down the money, your business model is entirely dependent upon people enjoying and subscribing to these journalists.
00:50:25.000 Exactly.
00:50:26.000 And you don't have advertising.
00:50:29.000 That seems to be where the pressure comes in, is when advertisers either don't like content or they don't like particular points of view that people are espousing.
00:50:42.000 Yeah, and I actually think that advertising, a particular kind of advertising, is the root problem that created a lot of these dynamics on the social media networks, where it's not just advertising, but it's programmatic advertising,
00:50:59.000 right?
00:50:59.000 Where, you know, I'm not buying an ad on Joe Rogan necessarily, but I'm buying an ad to show to like Jimmy Smith, this person who I can target minutely.
00:51:09.000 And so all of Twitter and Facebook and all these things, they sell these things that you can target down to the level of the person.
00:51:14.000 And so the thing that the platform's ultimately aggregating is just a bunch of attention that maximizes how much of people's time you have.
00:51:24.000 It doesn't actually matter what they were watching in between.
00:51:26.000 The advertiser doesn't know or care unless it's embarrassing for them and they want to cause some stink.
00:51:32.000 But that dynamic, the fact that that business model works so well and then they're doing the things that make them money and it pulls in a bad direction, is why they are the way they are.
00:51:43.000 It's not like the people at Twitter are evil.
00:51:44.000 Right.
00:51:45.000 They're not trying to do bad.
00:51:47.000 No, I think they think they're doing good.
00:51:49.000 I genuinely do.
00:51:51.000 And they probably have done a lot of good.
00:51:52.000 I don't want to be so down on Twitter that we can't acknowledge that there's good things about it.
00:51:56.000 There definitely are.
00:51:59.000 Yeah.
00:51:59.000 No, for sure.
00:52:01.000 Listen, it gets out information.
00:52:02.000 It's like if you want to find out what's going on, it's like the best place to go to immediately to find out like, you know, some country got overthrown, some chaos is happening somewhere in the world.
00:52:13.000 At least you're getting something.
00:52:14.000 And you also you're getting various perspectives.
00:52:17.000 You're getting some boots on the ground perspective.
00:52:19.000 You're getting some official perspectives.
00:52:22.000 It's just the problem is sorting it out.
00:52:26.000 It's not a good platform for sorting out what's accurate and what's not accurate.
00:52:30.000 It's like you just get, okay, let's see how this plays out.
00:52:33.000 You get a little piece of the information like, okay, let's see what the real thing is.
00:52:38.000 Every new technology fucks things up.
00:52:41.000 Every time we make something new, there's also terrible consequences that we have to grapple with.
00:52:48.000 When they came up with the printing press, it broke the world.
00:52:51.000 You got the Protestant Reformation.
00:52:53.000 It was a whole thing.
00:52:54.000 And I think we're living through one of those things where we've got this, like, for the first time very quickly, every human being in the world is wired together into one giant network and, like, paying attention to this thing is insane.
00:53:07.000 And we shouldn't expect that that goes over smoothly and everything just works perfectly the first time.
00:53:12.000 It's going to make a mess and we're going to have to figure out how to make it serve us.
00:53:16.000 It's kind of wild that there's really only one, though.
00:53:21.000 Speaking about Twitter, it's kind of wild that there's really one place where people go to bitch about things.
00:53:28.000 You know?
00:53:29.000 I mean, it really is.
00:53:30.000 I guess people go to Facebook, too, but I don't read that either.
00:53:34.000 Every time I go to Facebook, it's like these long diatribes and people screaming in the comments.
00:53:40.000 It's kind of the same thing, but Twitter in that short format of tweeting and quote tweeting and that kind of thing, it seems like it's bizarre that no one has replicated YouTube successfully.
00:53:55.000 As much as you might not like the algorithm, it's fucking genius in that it really does captivate people.
00:54:05.000 And it really is dependent, like my friend Ari did a test once where all he looked up was videos on puppies.
00:54:12.000 That's all he looked up.
00:54:13.000 And he goes, what do you know?
00:54:14.000 My fucking algorithm's all puppies.
00:54:16.000 And he goes, it's really, the problem's not the algorithm, the problem is people.
00:54:20.000 And that really is, I mean if you go to my, my YouTube feed is mostly nonsense.
00:54:26.000 In that it's mostly mindless things that I enjoy watching.
00:54:30.000 Professional pool and hot rods and stuff like that.
00:54:33.000 Mostly stuff that's non-consequential.
00:54:35.000 Then occasionally some deep dive into some world economic forum conspiracy.
00:54:41.000 I'm like, oh fucking Christ.
00:54:43.000 It's not bad to have stuff that's fun though.
00:54:45.000 We don't want a world with no puppies.
00:54:47.000 That shouldn't be the only way that we find out everything about the world.
00:54:50.000 But it's just fascinating that these algorithms – you've seen the Social Network documentary?
00:54:56.000 I mean, I think what those folks have kind of exposed that worked in these social media companies is that they knew what these algorithms were going to do, and they did it anyway.
00:55:08.000 And they know where this is going, and it seems to be going in a terrible way.
00:55:13.000 It looks like civil war or some sort of horrible divide of our country just based on human nature applied to this very disruptive technology.
00:55:27.000 Yeah.
00:55:27.000 It's scary shit.
00:55:29.000 It really is.
00:55:31.000 I mean...
00:55:31.000 This stuff was going through my head when I did the stupid essay in 2017. And at the time, even saying that out loud, even being like, there could be a civil war, felt insane.
00:55:40.000 Insane.
00:55:41.000 But I'm like, but look at like, just play the movie forward.
00:55:44.000 Like, what happens?
00:55:45.000 Tim Pool said that on my podcast years ago, and I thought he was just going way too over the top.
00:55:51.000 He said, I think we're going to Civil War.
00:55:52.000 I'm like, what the fuck?
00:55:53.000 Come on, man.
00:55:54.000 Relax.
00:55:55.000 But now, I'm like, ooh, maybe he was right.
00:55:59.000 Let's not, though.
00:56:00.000 It would be better if we could not.
00:56:01.000 No, for sure.
00:56:02.000 By a wide margin.
00:56:04.000 By a wide margin.
00:56:05.000 Let's not.
00:56:05.000 But at least we are most certainly in some sort of a battle of ideas that...
00:56:13.000 is so uncharitable and so rigid in its sides and its ideologies.
00:56:23.000 It fucking freaks me out.
00:56:25.000 The lack of nuance and perspective and the lack of objectivity in recognizing the flaws of both sides.
00:56:34.000 I mean, you obviously see that in politics, right?
00:56:37.000 You're always going to see that where the people that are in the positions of power, whether it's the White House press secretary, they're always going to give you the best possible spin on everything they can.
00:56:49.000 And when the questions get weird, they end the conversation and they leave.
00:56:53.000 They're just trying to propagandize, just trying to promote a certain thing.
00:56:56.000 The trap is believing that it's a battle between these two teams.
00:57:01.000 It's like the left versus the right and that the other team is so bad that whatever our team does to fight them is necessary and justified.
00:57:07.000 That's the trap.
00:57:09.000 And the reality is it's like everybody who's sane versus...
00:57:14.000 actually, a tiny minority of people who are nuts, but who have massive, amplified power because of the way these dynamics work and the way that institutions have played out in the whole thing.
00:57:25.000 Also, the things that they're talking about, they're consequential.
00:57:29.000 When you have a small amount of people that are arguing about things or yelling about things, these things are consequential, and these are the little battlegrounds that these ideas play out on, and they recognize the significance of these battlegrounds and they put all their time into it.
00:57:47.000 And then you have a lot of people that are just like genuinely mentally ill because of these platforms.
00:57:51.000 I think it's ramped up anxiety at an unprecedented level.
00:57:56.000 I mean, and there's some people that it carries over.
00:57:58.000 I have some friends that are on Twitter too much and then we'll go out to dinner and they carry Twitter out into the regular everyday life.
00:58:06.000 This is what's wrong with it.
00:58:08.000 Like, here we are right now.
00:58:09.000 It's Austin.
00:58:10.000 It's beautiful.
00:58:11.000 We're having dinner.
00:58:11.000 We're having a good time.
00:58:12.000 And you're freaking out about some argument that people are having about something that has literally nothing to do with you.
00:58:19.000 Yeah, I find myself doing it, because I'm sort of addicted to Twitter as well.
00:58:22.000 I go through a phase where I delete the app, and then I get it on my browser or my phone, and then I hate myself, and then I delete it again.
00:58:28.000 And when you get into it, you talk to someone who's just a normal person in the world, and you find yourself saying something about what's happening like it makes any sense, and you're like, what the hell am I talking about?
00:58:39.000 This is literally insane, and I couldn't realize that it was insane until I just talked to someone that just had no idea what the hell.
00:58:47.000 But on the flip side of that, I felt that same way about all the woke chaos that was coming out of universities in like 2014. We were talking about it in 2014 and 2015 and people were like, why are you focusing on this?
00:58:59.000 This has zero effect on your life.
00:59:02.000 And my take was this is going to spill in.
00:59:05.000 These kids are going to graduate.
00:59:06.000 Why are you focusing on a virus that's only in one little Chinese city?
00:59:08.000 Exactly.
00:59:09.000 There's a new one to worry about now.
00:59:10.000 Have you heard about the new one?
00:59:12.000 Which one?
00:59:12.000 Oh, the one just came out.
00:59:14.000 Have you seen that, Jamie?
00:59:16.000 I'll send it to you.
00:59:17.000 New virus just dropped?
00:59:18.000 Yep, new virus just dropped.
00:59:20.000 Good times.
00:59:22.000 Here, it's 35 people have been affected by this thing.
00:59:26.000 I'll send it to you right now, John.
00:59:27.000 The Langya?
00:59:30.000 Yes.
00:59:31.000 Langya virus.
00:59:33.000 China sounds alarm.
00:59:35.000 Here, I'll send it to you.
00:59:36.000 It is the Daily Mail.
00:59:38.000 It's been reported by tons of places.
00:59:40.000 Oh yeah, I'm sure it's real.
00:59:42.000 Here it is.
00:59:43.000 So it's from shrews.
00:59:46.000 Well, what does this do?
00:59:47.000 Is it like a monkeypox-type deal that sucks, but it doesn't kill you, or is it a real problem?
00:59:54.000 A virus known to kill up to 75% of cases.
00:59:57.000 Okay, but of what?
00:59:59.000 Shrews or of humans?
01:00:00.000 None of the cases in two Chinese provinces so far resulted in people dying.
01:00:04.000 That's what I wanted to know.
01:00:05.000 So 35 people got it, no one's dead.
01:00:07.000 Experts believe the virus is passed on by animals, including shrews.
01:00:11.000 Doctors have raised the alarm over a brand-new virus.
01:00:13.000 You know, I'm fucking terrified that we've become conditioned now to start freaking out about every virus that comes our way.
01:00:21.000 Because we've always had swine flu and avian flu.
01:00:25.000 Yeah, you pay attention now.
01:00:26.000 You know what would be good, though, is if we actually took the lessons we learned from coronavirus and prepared correctly for the next thing.
01:00:34.000 Well, we need a Substack for medicine.
01:00:37.000 That would be a good idea.
01:00:38.000 I mean, obviously you have it, and there's some great doctors and clinical researchers that do post on Substack.
01:00:45.000 I've read a lot of their work.
01:00:47.000 The problem is there's great consequences in those industries if you step outside the lines and you talk about things that are unpopular.
01:00:58.000 And that's one of the real positive things about Substack, is you do give people, if they get cast out of these institutions, you give them a very viable and often better alternative.
01:01:11.000 And now, because of the popularity of Substack, there's a real good argument that they wouldn't just reach the same amount of people, they'd reach more people.
01:01:20.000 Particularly if these things get promoted by other people like Bari or other journalists that are very popular on Substack.
01:01:29.000 And there, you know, there is a cost to that, right?
01:01:32.000 Sometimes people get cast out of places because they're nuts and they're wrong and they're crazy.
01:01:37.000 Like, that does happen, right?
01:01:39.000 And so if you have a platform where people can publish, like, you're going to get some crazy people.
01:01:45.000 But you're also going to get the people who are cast out for the wrong reasons.
01:01:49.000 And this is why the thing where...
01:01:53.000 Choosing which human beings you trust on Substack, building a relationship over time with people, making your own guess of their integrity, and then being able to find out who's reading what, who's sharing what, who's promoting what, I think is a better answer to how can we get all of the points of view out,
01:02:12.000 even though it's still imperfect.
01:02:14.000 There's still going to be downsides to any hack you can take with this stuff.
01:02:18.000 What's the big dispute in the Substack office when you guys discuss these things?
01:02:23.000 Is everybody on the same page?
01:02:26.000 I think mostly.
01:02:27.000 I mean, one thing that we've done fairly carefully is that we've...
01:02:32.000 We know what our stance is on these things, and we've written down...
01:02:36.000 We've, like, written...
01:02:37.000 Hamish and I and Jay have written essays about, like...
01:02:41.000 Excuse me.
01:02:46.000 We got a...
01:02:47.000 There's a little cough button there, too.
01:02:49.000 I love using that thing.
01:02:50.000 Oh, that's smart.
01:02:53.000 I love the cough button.
01:02:55.000 Makes me feel like we're a real show.
01:02:57.000 That's a genius invention.
01:03:00.000 You know, we wrote down this stuff.
01:03:01.000 We made it public.
01:03:02.000 And we made it public before we were in some controversy or people were mad at us.
01:03:06.000 We sort of take the time to think, what do we believe?
01:03:10.000 Why do we believe it?
01:03:11.000 Why are we working on this thing?
01:03:12.000 Why is it worth working on?
01:03:13.000 And then when people join the company, they know that stuff.
01:03:17.000 And if you come to Substack and you're like, actually, I think you should, you know, not give people a platform and not put writers in charge and have this tight view of what's real, it would just be crazy, because putting the work into something that's against what you believe in doesn't make sense.
01:03:33.000 Also, it would be kind of like managing it at scale.
01:03:36.000 Like, how could you?
01:03:37.000 When you have tens of thousands of writers, that means you would have to have tens of thousands of people going over everyone's stuff, making sure that it's accurate and it doesn't promote some harmful narrative.
01:03:51.000 And those people end up making mistakes, and the mistakes are really unfair.
01:03:54.000 Right.
01:03:54.000 And I think people have had this experience at this point.
01:03:56.000 There's been lots of people that just have had, like... Yeah.
01:04:15.000 Managing at scale, I try to explain this to people when they shit all over YouTube.
01:04:21.000 I'm like, could you imagine being YouTube?
01:04:23.000 Just the sheer volume of videos that get produced every day is insane.
01:04:30.000 The sheer amount of- Like how many hours of video per second are they getting?
01:04:34.000 It's pretty crazy.
01:04:35.000 Jamie, what is the...
01:04:37.000 Well, we could probably just get like hours per day.
01:04:42.000 The amount of hours per day that YouTube...
01:04:45.000 And obviously there's multiple languages and so you get into that and like good luck.
01:04:51.000 Because, you know, how many translators are you going to have in all these different countries that are going to read all these or watch, read transcripts or watch videos?
01:05:00.000 As of July 3rd, there are 500 hours of video uploaded per minute.
01:05:11.000 That doesn't seem like that much.
01:05:12.000 Per minute, though.
01:05:13.000 I know.
01:05:14.000 It adds up, obviously.
01:05:16.000 500 hours, though.
01:05:17.000 I know.
01:05:17.000 It's a lot.
01:05:18.000 Yeah, it's definitely a lot.
01:05:20.000 It's a ton.
01:05:21.000 And what is it going to be in five years?
01:05:23.000 It's probably going to be ten times that.
01:05:28.000 Do you read anything on Substack?
01:05:30.000 Do I read anything?
01:05:31.000 I read Bari's.
01:05:32.000 I read Alex Berenson.
01:05:35.000 I read ones that get recommended to me.
01:05:37.000 Generally it's by clinical physicians or doctors or people reporting on life extension stuff and health things and things along those lines.
01:05:48.000 I love it for that.
01:05:50.000 It's great for me because I really feel like I'm getting the perspective of the writer, which I really enjoy, because that's what I really love about podcasts as well, is that I'm getting a clear...
01:06:03.000 I like when I know that it's coming from the person, that I'm getting this individual's mind translated into words.
01:06:13.000 They might be right.
01:06:13.000 They might be wrong.
01:06:14.000 Yeah, but I'm getting it from them.
01:06:16.000 With someone like Matt Taibbi, I love the way he writes.
01:06:19.000 I love the way his mind works.
01:06:21.000 When he writes, I'm getting his writing.
01:06:23.000 I'm getting his thought process and his editing.
01:06:26.000 It's a piece of art.
01:06:27.000 It's interesting.
01:06:28.000 It's an interesting form of art because it's an art that focuses on thought process and all of this person's life experiences and education, and how it translates in them trying to broadcast this to people.
01:06:46.000 It's cool.
01:06:47.000 I think Substack is one of the most important things that's ever happened to journalism in my lifetime because it's a free portal, a new method of distributing content that just is very exciting.
01:06:59.000 When I first found out about it, and then when Bari left the New York Times and went over there, I was like, ooh.
01:07:04.000 I'm like, this is exciting.
01:07:06.000 And one thing that's new about it, I think, is everybody thinks of the people, they think of Bari Weiss, like famous journalist that I know that left X and came to Substack, and that's the idea that people have in their mind of who's on Substack.
01:07:19.000 But there's a growing set of people that I think is much larger over time of people who weren't writers, right, who had something to give the world as a writer, as a thinker, but didn't see a path to doing that in what existed before.
01:07:37.000 But for Substack, they would have gone to law school like their parents wanted them to or whatever.
01:07:41.000 And so you're starting to see people on Substack who become professional writers on the platform.
01:07:46.000 And you start to get perspectives that otherwise never would have existed.
01:07:50.000 I mean, you see this with some of the doctors that are very interesting.
01:07:53.000 You see people from other industries.
01:07:54.000 And there's just people growing up that, like, have something to give the world and would not have been able to, like, write if they didn't have this.
01:08:04.000 Yeah, that's the parallel to podcasting.
01:08:07.000 Exactly.
01:08:07.000 Because the barrier to entry is so small, particularly for audio podcasts.
01:08:11.000 It's so easy.
01:08:12.000 You literally can press the Voice Notes app on your phone and you could, bam, make a podcast.
01:08:17.000 Which means there are a ton of terrible podcasts that nobody listens to.
01:08:20.000 Sure.
01:08:21.000 But that's okay.
01:08:21.000 That's okay.
01:08:22.000 I mean, it's the same thing with YouTube videos.
01:08:24.000 I mean, how many people who make YouTube videos are literally just using their webcam and just talking to it?
01:08:30.000 And they might have fucking five million subscribers.
01:08:33.000 It's pretty wild in that regard.
01:08:36.000 The barrier for entry is much lower, so you're going to have a lot more nonsense content and bullshit content.
01:08:44.000 But you're also going to have people that maybe have something to offer and didn't really think that it was possible for them before, which is me.
01:08:53.000 I mean, that's how I became a podcaster.
01:08:55.000 I never thought anybody was going to give me a fucking radio show.
01:08:58.000 If I had to try to pitch the model of what the JRE is to some company.
01:09:06.000 They would have kicked me out of the office a long time ago.
01:09:08.000 There's no way.
01:09:10.000 I want to just have people I'm interested in talking to.
01:09:14.000 I don't give a fuck if they're famous.
01:09:16.000 What do I find interesting?
01:09:18.000 Let's talk to those folks.
01:09:20.000 Do you believe in audience capture?
01:09:23.000 Yeah, that's real.
01:09:25.000 I've seen it with comedians, for sure.
01:09:27.000 I've seen it with various people, whether they're commentators or opinion makers, whatever they are. They seem to find there's a thing where it gets them the most amount of juice,
01:09:43.000 the most amount of traction, and they lean into that.
01:09:45.000 They get love and support.
01:09:47.000 I've seen people convert political ideologies because of it, because it seems like the right is pretty good at that.
01:09:54.000 It's really interesting.
01:09:55.000 Fox News has said more positive things about me than any left-wing company.
01:10:02.000 And I think that could be a problem for people is they do switch their ideology because they find they're getting a certain amount of love from one direction or the other.
01:10:14.000 I was reading a good Substack piece about this yesterday.
01:10:18.000 The example he used was this YouTuber who was this kind of normal kid.
01:10:26.000 And he has a picture at the start of this skinny guy that made these videos and found a niche like eating dinner and talking to the camera and people started to watch it.
01:10:36.000 And over time, the end picture is, like, years later, he's this famous guy who's got millions of subscribers, making all this money, but he's, like, just destroyed.
01:10:45.000 He's, like, this huge, morbidly obese, terrible health problems thing.
01:10:51.000 And this post makes this convincing argument that the way he got there was this process of capture, and it wasn't...
01:11:01.000 It wasn't like this guy was like, I know, I'm going to become this horrible train wreck in order to make money and be famous, and that's a good idea, so I'm going to do it.
01:11:09.000 But it's the thing that happens where you get these subtle cues.
01:11:13.000 It's kind of the same way that you become the average of the people you hang out with.
01:11:17.000 And so you've got to choose who you hang out with intelligently because you can't resist that.
01:11:22.000 And when you're on a platform, even if you don't want to, you get these signals of like, what's working?
01:11:28.000 What's making people watch this?
01:11:29.000 What's making the comments?
01:11:30.000 What are people asking me for?
01:11:31.000 And it makes a pretty convincing case that that thing can, like, really, you know, destroy this guy's life, I think.
01:11:39.000 Well, an argument against that is Mr. Beast.
01:11:42.000 You know, Mr. Beast is the biggest YouTuber, and he could not be a nicer guy.
01:11:48.000 And his whole thing is about figuring out what captures people's attention.
01:11:55.000 But it's all super positive.
01:11:57.000 He's very charitable.
01:11:59.000 He has these turkey drives where he gives away free turkeys for people for Thanksgiving.
01:12:05.000 He makes this fun video about it.
01:12:08.000 He runs a food bank.
01:12:10.000 He does so much for charitable organizations, but yet he's 100%...
01:12:19.000 driven in his idea that, I want to make a video that reaches the most amount of people possible. So what can I do in terms of, like, editing?
01:12:30.000 What can I do in terms of the caption?
01:12:32.000 What can I do in terms of the image that I use? And he's very meticulous about that, but yet is not... In fact, he's become more charitable, more nice, more friendly, more happy. And it's almost like there's a good version and a bad version of it.
01:12:48.000 Yeah.
01:12:49.000 If you let yourself get captured by people who want you to be the best version of yourself, that's good.
01:12:54.000 Right.
01:12:54.000 If you're getting more charitable to be massively successful, that sounds great.
01:12:59.000 Oh, yeah.
01:13:00.000 But I don't think that's even what he's doing.
01:13:03.000 He's a very interesting guy because he's very young and very wise for a young guy.
01:13:08.000 But also, like...
01:13:10.000 Silly, he's fun, and he's figured out a way to be successful and still maintain who he is.
01:13:17.000 That's where it gets tricky with people, because some people don't have a rigid foundation.
01:13:22.000 They don't have a strict set of ethics and morals and a code that they live by, and so then anything that's successful, they gravitate towards that like metal filings towards a magnet.
01:13:38.000 And the irony of that, no pun intended, is that you're probably less successful in the long run.
01:13:43.000 Right?
01:13:43.000 If you have the principles and you know what you're doing, ultimately, if that thing is right, that's the thing that people will follow you for.
01:13:51.000 That's the thing that people care about.
01:13:52.000 It depends on how you define success, right?
01:13:53.000 If you define success in terms of popularity, there's a lot of terrible people that are popular.
01:13:58.000 You know, because they're popular for talking shit or being mean.
01:14:02.000 Or they're popular for, you know, causing fights and creating drama.
01:14:06.000 And there's a real currency in drama, unfortunately.
01:14:09.000 There's a lot of people that that's their main thing is just being shitty.
01:14:17.000 So there's a bunch of different ways you could go.
01:14:20.000 But audience capture, yeah, it's real.
01:14:23.000 And I try very hard to make sure that I don't get sucked into any of that.
01:14:31.000 That's why you only go on Twitter twice a day.
01:14:33.000 Yeah, I mean, if I go on, I go on for a couple seconds, really.
01:14:38.000 I go on every now and then to check stuff and see what's happening, what's trending, and then I go, ugh, let me get out of here before I read something about me.
01:14:49.000 I just think that human beings are very malleable.
01:14:54.000 We're very easily influenced whether we like it or not.
01:14:57.000 That's why cults exist.
01:14:58.000 You just got to be meticulous about the way you think about things and also you have to spend a lot of time alone thinking.
01:15:06.000 Like genuinely alone, no electronics, thinking about how you view life, how you view yourself, how you view all the various projects you have that you're doing and what are you doing with them?
01:15:22.000 Why are you doing it?
01:15:24.000 Right.
01:15:25.000 The magic is not, I'm going to be immune to the influence of everything that I read and see.
01:15:29.000 The magic is, at some point, I control where I spend my time and attention.
01:15:33.000 I control whether I'm going to go for a walk or who I'm going to listen to and spend my time with.
01:15:38.000 And by choosing that intelligently, I can shape who I am, what I think.
01:15:43.000 That's the value of discipline.
01:15:44.000 You have to have discipline in any sort of thing you're doing, especially any sort of high-pressure thing.
01:15:50.000 You have to have discipline.
01:15:51.000 You have to have an understanding.
01:15:53.000 You have to have an understanding of exactly why you're doing it and what you're doing.
01:15:59.000 And some people don't.
01:16:00.000 Some people just lick their finger and which way is the wind going?
01:16:03.000 I'm going that way.
01:16:04.000 And you can be successful that way, too.
01:16:07.000 There's a lot of people that are grifters.
01:16:09.000 They're successful just grifting.
01:16:14.000 Some of them, they have a large amount of people that hate them, but there's enough people that pay attention that it pays the bills and they keep going.
01:16:25.000 I'm sure you have that on Substack too, right?
01:16:28.000 It's an index fund of the whole internet.
01:16:30.000 You get everything.
01:16:31.000 We wouldn't be successful if there wasn't something for every niche.
01:16:36.000 Why did you ask about audience capture?
01:16:38.000 It was something I read yesterday that was really interesting and something I think about a lot.
01:16:43.000 There's a lot of things about the Substack model that I think are magical and good.
01:16:47.000 The fact that your readers are the ones hiring and firing you, the fact that you can actually make money from subscriptions, the fact that a relatively small number of people that really like you can make you a living or even a fortune from subscriptions.
01:17:00.000 All that stuff is great.
01:17:01.000 One of the things I do sometimes think about is, well, Is this a recipe for audience capture?
01:17:07.000 Is this a thing where if I notice what's going to get me subscribers, can I get pulled into this thing?
01:17:14.000 I think a lot about, is the Substack model helping people be the best versions of themselves?
01:17:19.000 Is it helping writers become the best, do the work that they actually believe in?
01:17:25.000 And if it's making them money, if it's helping hone their craft, if it's, like, working with their discipline, that's good.
01:17:30.000 If it's pulling them to become the wrong thing, then that's bad.
01:17:35.000 And I think there is some of that, to be honest.
01:17:37.000 I think that's not a zero thing.
01:17:39.000 But I think it's not unique to Substack.
01:17:40.000 I think every platform has that.
01:17:41.000 And it's kind of a question of like, how do you...
01:17:44.000 You can't avoid those forces.
01:17:46.000 So it's a question of how do you harness them for something positive?
01:17:52.000 Yeah, or how do you avoid changing in a way you don't want to change?
01:17:59.000 Like what is it about audience capture that's so compelling?
01:18:03.000 I guess people want acceptance and they want love and when they find it's generally going in this certain direction and they get positive responses, they tend to lean into it.
01:18:17.000 I think the best writers are often quite disagreeable.
01:18:22.000 Best writers, best journalists are often the people who kind of, like, poop in the punch bowl in social settings sometimes, who are willing to, like, have the strongest natural personality that goes against those urges.
01:18:34.000 But everybody's human, right?
01:18:36.000 Everyone has human nature at some point.
01:18:38.000 And it becomes really powerful.
01:18:41.000 I don't know.
01:18:41.000 You seem to be someone that has contended with this stuff at a scale that almost nobody has.
01:18:46.000 And it's, yeah, I'm just curious how you think about it.
01:18:50.000 Yeah, worry about it.
01:18:51.000 I mean, maybe that's why I haven't gotten sucked into it, you know?
01:18:55.000 I mean, also...
01:18:58.000 I don't know.
01:19:01.000 I mean, it's gonna sound crazy, but I really think that this thing created itself.
01:19:07.000 I feel like it just tricked me into doing it.
01:19:10.000 I really do.
01:19:11.000 Sometimes.
01:19:12.000 I just feel like...
01:19:14.000 Sometimes I feel like...
01:19:20.000 I didn't know that this is what I wanted to do until I started doing it. And then when I started doing it, I was like, oh. And then the more I do it, the more I feel like I'm just sort of showing up and just turning on the antenna and letting it happen, and then bringing in all these people that I find interesting. And then all these other people that listen also find those people interesting, and then they have this hunger for it.
01:19:47.000 And then that sort of, that excites me.
01:19:51.000 And then I hear from so many people that got inspired to do different things with their life, to maybe start exercising and eating well, and also recognizing the effect that that has on their mental state. And just seeing the way I interact with my friends, I'm very fortunate that I have a really good group of friends.
01:20:16.000 Everyone's really fun and smart and supportive, and we laugh a lot. And that also encourages people to seek that out in their own lives and to have those kinds of interactions with other people that they care about, and it inspires similar kinds of conversations and also similar podcasts.
01:20:34.000 There's a lot of podcasts that were inspired because of this.
01:20:38.000 That's exciting to me.
01:20:40.000 And so I feel like I have this obligation to make sure that I'm not fucking it up.
01:20:45.000 But I really do feel like it made itself.
01:20:48.000 I know it sounds crazy, but maybe it's a way of alleviating responsibility on my behalf.
01:20:55.000 Maybe I just think, oh, I didn't even do it.
01:20:57.000 It's doing itself.
01:20:58.000 It's something that needed to exist and you just happened along and were the perfect person in the perfect place.
01:21:04.000 Yeah, but I felt compelled to do it.
01:21:07.000 In the beginning, I was making no money and my wife was like, why are you doing this?
01:21:10.000 My friends were saying that.
01:21:12.000 They were like, why are you doing this?
01:21:13.000 I'm like, I don't know, man.
01:21:14.000 I feel like I need to do it.
01:21:15.000 I just feel like it's something.
01:21:17.000 It's fun.
01:21:18.000 I enjoy doing it.
01:21:19.000 But I also feel like it's drawing me towards it.
01:21:25.000 I know it sounds grandiose, but it does...
01:21:27.000 I don't know.
01:21:28.000 Maybe that's what everything that's successful feels like.
01:21:30.000 It feels like, oh, this is supposed to happen this way.
01:21:34.000 It's like it's pulled me into it.
01:21:37.000 I feel a little bit like that with Substack.
01:21:39.000 We're not at the same scale of success, but I didn't set out to do this.
01:21:41.000 I was doing something else, and it sort of became this undeniable thing.
01:21:46.000 And it's like, well, if that could exist, it needs to exist.
01:21:50.000 How can we not do this?
01:21:52.000 Right.
01:21:53.000 But doesn't that always happen in the world, though?
01:21:55.000 I mean, that's one of the things about human beings, is that human beings, they encounter a dilemma, and then a solution to that dilemma becomes inescapable.
01:22:04.000 They feel like they have to do this.
01:22:07.000 Do you think if you got hit by a bus, if you died somehow the day that you were going to start the show, that someone else would have made something similar to this?
01:22:15.000 Or do you think history would be totally different?
01:22:19.000 That's a good question.
01:22:21.000 Yeah, someone would have done something similar.
01:22:23.000 I don't know.
01:22:25.000 Maybe not that similar.
01:22:26.000 I don't know.
01:22:28.000 I'm often shocked that someone hasn't done it exactly the same way.
01:22:34.000 It's not that hard to do.
01:22:35.000 Just talk to people.
01:22:37.000 It's literally one of the easiest things.
01:22:39.000 But there's people that are inspired, so they set out to copy it, and they just don't copy what's important?
01:22:45.000 I think you have to have genuine curiosity.
01:22:50.000 I think that's a...
01:22:51.000 I mean, if I really wanted to break...
01:22:53.000 If I was a journalist trying to break down the success of my show, I would probably say genuine curiosity is
01:23:00.000 the most important factor.
01:23:02.000 And also a wide range of interests.
01:23:06.000 You know, I'm interested in all kinds of things.
01:23:09.000 So when I talk to someone, I'm genuinely curious.
01:23:12.000 It's like, what was it about this that was so compelling to you?
01:23:16.000 Why'd you start?
01:23:17.000 Did you have doubts?
01:23:18.000 Like, what were those doubts?
01:23:19.000 And what was fueled by...
01:23:20.000 Was there any time you thought about quitting?
01:23:22.000 What is it?
01:23:24.000 What's going on in your head?
01:23:25.000 And what can I get out of that that enhances my focus or my perspective?
01:23:32.000 Yeah, and then, you know, don't be a douchebag.
01:23:37.000 Easier said than done sometimes.
01:23:39.000 Yeah, that's very much easier said than done sometimes.
01:23:41.000 Because there's also a lot of pressure that comes with things and pressure makes people, you know, act and behave in shitty ways sometimes because they're just overwhelmed.
01:23:51.000 Does any of the flack you take discourage you?
01:23:54.000 There's like people are mad on the internet, people are mad at Spotify or whatever place.
01:23:58.000 Does that like slow you down?
01:23:59.000 No, not that flack.
01:24:00.000 No.
01:24:00.000 No, that just makes me laugh.
01:24:02.000 Well, okay, so it sounds like there is flack that would discourage you.
01:24:05.000 Yeah, if I genuinely did a bad thing.
01:24:10.000 Like, if I genuinely...
01:24:11.000 Like, if I... said something mean and incorrect about someone, or if I did, you know, that would discourage me.
01:24:19.000 And then someone said something like, oh, you shouldn't have said that because of this.
01:24:24.000 And I'm like, oh, goddammit, they're right.
01:24:25.000 But I would just say it.
01:24:27.000 I would just correct myself.
01:24:28.000 I would just come on and apologize and say, this is why I thought this.
01:24:33.000 And it just turns out to not be true.
01:24:35.000 And then, you know, I didn't mean to hurt people's feelings.
01:24:39.000 But that's also part of just being a human being and communication.
01:24:45.000 And also it's impossible for me to...
01:24:49.000 You know, you're not going to do...
01:24:52.000 It's not going to be perfect.
01:24:54.000 It's just...
01:24:54.000 That's one of the things that's interesting about conversations.
01:24:56.000 This is...
01:24:57.000 I don't know the next word out of my mouth right now.
01:25:01.000 And that's what's exciting about it.
01:25:04.000 It's not planned out.
01:25:05.000 And I think when people do see very planned out statements and planned out interviews, they're not...
01:25:15.000 Like Obama did a podcast and, you know, nobody wanted to listen, which is crazy.
01:25:20.000 He's one of the most popular presidents of all time, one of the most interesting people that's ever lived.
01:25:24.000 Nobody gives a fuck.
01:25:25.000 Why?
01:25:25.000 Because what he's talking about is like, listen, he's free, okay?
01:25:30.000 He's gone.
01:25:31.000 He's no longer the president.
01:25:33.000 He's the former two-term president.
01:25:35.000 He's insanely rich.
01:25:36.000 If anybody can say what he thinks, it's that guy.
01:25:39.000 But he can't.
01:25:40.000 But he can't.
01:25:41.000 You know, he's completely...
01:25:44.000 I mean, if Obama just had a podcast and he smoked a joint, started talking shit about things, it would be fucking amazing.
01:25:50.000 It would be amazing.
01:25:52.000 But would that interrupt his ability to make, you know, $400,000 a speech for bankers?
01:26:01.000 It probably would.
01:26:03.000 Yeah, so, and it'd also fuck with his chance of being a leader in the party, and, you know, people would face all kinds of criticism.
01:26:12.000 You know, Fox News would run some horrible story about it.
01:26:15.000 You can't.
01:26:17.000 Stuck.
01:26:17.000 See, I'm a cage-fighting commentator who's also a stand-up comedian.
01:26:23.000 There's a lot of built-in escape valves.
01:26:29.000 If we wanted to make Substack better for the interesting version of this, for people that are doing the things that you can't predict, like if you are the king of Substack, what do you think we should do?
01:26:40.000 I think you're doing the right thing.
01:26:41.000 I think the way you're doing it...
01:26:42.000 I don't envy the choices that you have to make and the decisions and the complications that must come about through it, but having that steadfast ethic of no censorship,
01:27:02.000 letting people express themselves, not having some sort of a tricky algorithm, not being compelled by advertisers.
01:27:14.000 That's the recipe for success, for what you're doing.
01:27:18.000 What you're doing is a disruptive journalism outlet where people, you know, like guys like Glenn Greenwald who get kicked out of the very newspaper that he fucking founded, you know, because he wanted to report on the Hunter Biden laptop case,
01:27:34.000 which is wild because now...
01:27:38.000 Here we are two years later.
01:27:39.000 It's fucking true.
01:27:41.000 It's true and it's wild.
01:27:43.000 And it's wilder than we even thought it was because more stuff's coming out and there's pretty clear evidence of corruption.
01:27:52.000 And this would have been a consideration when people were voting.
01:27:57.000 And they were so terrified that Trump was going to win again and he just...
01:28:03.000 He represented in many people's eyes this ultimate enemy, this ultimate evil.
01:28:10.000 And they wanted him out by all means necessary, by any means necessary.
01:28:14.000 And so they were willing to censor legitimate information from one of the oldest newspapers in the country, the New York Post, which was writing about it, which is pretty fucking crazy and scary to me.
01:28:26.000 Because this is all, in my eyes, this is a gradual process.
01:28:31.000 And if you'll accept that, you'll accept more.
01:28:33.000 And if you'll accept that kind of censorship...
01:28:35.000 Yeah, you get used to it and then the next thing and the next thing.
01:28:37.000 Especially if you can demonize the person that you're censoring against.
01:28:40.000 If this is all ultimately to get Trump out of office, well, who the fuck didn't want that?
01:28:45.000 Let's get him out of there.
01:28:46.000 Who cares?
01:28:47.000 Just let that thing come out in February.
01:28:50.000 But in November, no fucking chance.
01:28:53.000 We can't print this.
01:28:55.000 And so those sort of decisions that people make, although they think they're doing the right thing, that's where you have to have these steadfast ethics.
01:29:05.000 You have to have these rock-solid foundational ethics where you are not going to give in to any sort of peer pressure or any irrational people that seem to think that he's some sort of a threat to humanity and a threat to democracy.
01:29:23.000 And no matter what, you have to make sure that that doesn't happen.
01:29:27.000 And we could work all the rest out later.
01:29:29.000 The problem is, once you agree to that, that's a slippery slope.
01:29:33.000 It's like, that was the thing about the Patriot Act, where they were talking about indefinite detention of people.
01:29:42.000 And Obama was like, you know, I would never use that.
01:29:45.000 Don't worry, I would never use that.
01:29:46.000 Well, okay, but what about the next guy?
01:29:48.000 What about the person after you?
01:29:49.000 And it turns out the next guy was Trump.
01:29:51.000 And we were scared of him using something like that.
01:29:54.000 And we're scared about someone who's worse than Trump.
01:29:57.000 If we can't come up with some sort of a common ground, a middle ground in this country, and agree that we're all a part of a community, that's what we're supposed to be.
01:30:05.000 We're supposed to be the community of the United States of America.
01:30:08.000 That's ultimately what a country is.
01:30:10.000 It's a massive community.
01:30:11.000 We want the best for the greater good, the whole.
01:30:16.000 But...
01:30:19.000 We can't look at it that way, and we keep looking at it like there's people that are going to ruin it: the GOP, I'm the sworn enemy of the GOP, we've got to get them out of office, and these fucking liberals are ruining it and want to turn everyone into a Marxist and your kids are all going to be trans.
01:30:35.000 If we don't find some sort of a rational common ground in this country, We're going to continue to feed into this.
01:30:43.000 It's going to escalate.
01:30:44.000 It's going to keep going.
01:30:45.000 That's where I'm really worried.
01:30:46.000 I'm really worried that cooler heads have not prevailed.
01:30:51.000 And there's not enough voices that say what I just said.
01:30:55.000 Not enough voices that say, like, we're supposed to be all together.
01:30:58.000 And ultimately, the whole world is supposed to be all together.
01:31:01.000 One of the more beautiful things about the Internet is the Internet...
01:31:04.000 allows you, first of all, to translate things into a million different languages, and you have access to all this information all around the world, and more people around the world should be able to realize that we all share common interests.
01:31:16.000 We all want to have a good life. We all want to be able to do whatever we want.
01:31:21.000 We all want to be able to express ourselves honestly. We all want happiness for our families and our loved ones, and we all want to be living in a world that's not fucking polluted and on fire. There seem to be pretty common,
01:31:36.000 very important, bedrock foundational ideas that we all agree on worldwide.
01:31:41.000 And then we have people that are making insane amounts of money by sort of hijacking these individual ideas and individual issues that we all seem to find important.
01:31:55.000 Some of it's they're making money and some of it...
01:31:57.000 I think the thing that fascinates me is a lot of the time I think it's good people trying to do their best.
01:32:04.000 Yeah.
01:32:04.000 Screwing it up.
01:32:05.000 Right?
01:32:06.000 There's cases where there's people tenting their fingers and being like, I'm going to profit off the world going to hell in this way.
01:32:12.000 You got to say that in German accent.
01:32:15.000 I don't know if I can do a good enough German accent.
01:32:17.000 You will own nothing and you will be happy.
01:32:22.000 It's fun that you just say that.
01:32:25.000 But the scary thing is you don't need that.
01:32:28.000 Sure, there's villains out there, but you don't need there to be a villain.
01:32:31.000 This stuff can happen by good people.
01:32:34.000 And you talked about the Hunter Biden thing.
01:32:36.000 I think that was a low point for tech.
01:32:38.000 I think people at Twitter and Facebook made disastrously the wrong call.
01:32:41.000 To some extent, they admitted it later.
01:32:43.000 And I have more sympathy for them than I otherwise might, because I know what it feels like when you're in that moment.
01:32:49.000 You're in that moment where it feels like this is a whole new thing.
01:32:52.000 You lose the perspective of history.
01:32:54.000 You lose the perspective of the long view.
01:32:56.000 You're just looking at what people are saying on social media.
01:32:58.000 It feels like everything's on fire.
01:32:59.000 It feels like you have to do this.
01:33:03.000 You get buffeted by circumstance.
01:33:07.000 Even if you are good or want good things, you can make the wrong call.
01:33:12.000 And you can do that if you just lose the broader historical perspective.
01:33:17.000 The solution to a lot of this stuff is not unknown, right?
01:33:20.000 We inherited a lot of really good ideas, like maybe we should have free speech, maybe we should have the rule of law, maybe we should have rights.
01:33:28.000 Other generations learned this stuff the hard way and gave it to us.
01:33:32.000 And then a lot of the time we just forget.
01:33:35.000 And I think we got to do better at that.
01:33:38.000 I mean, one of the things that we did, I told you, we wrote this stuff down.
01:33:41.000 We wrote this stuff down when we were not in the heat of it.
01:33:43.000 And then we were in the heat of it.
01:33:44.000 We just look back at what we wrote.
01:33:45.000 We're like, what the hell do we believe?
01:33:47.000 What are we going to do?
01:33:49.000 And when you do that, it makes it easier not to screw up in that way.
01:33:54.000 Not to get lost in the plot and it's like, oh, the current moment is everything and we have to do something, something, something and lose yourself.
01:34:05.000 Where do you think, do you have an idea of where the country's going?
01:34:11.000 That's an open-ended question, isn't it?
01:34:13.000 I wish I did.
01:34:15.000 I mean, I have fears and I have hopes.
01:34:17.000 What's your fears?
01:34:18.000 I think my fears are the things that you've been saying, right?
01:34:20.000 The escalating division, the way that we understand ourselves, right?
01:34:29.000 The media, the social media, the various things, the way that we form a picture of what the country is and who each other are.
01:34:38.000 Keep becoming this funhouse mirror that turns us into the thing that we feared, which is lunatics that are at each other's throats.
01:34:48.000 I think that's my fear.
01:34:50.000 That's a little bit of, like, why I wanted to work on Substack in the first place.
01:34:53.000 I was like, it feels like there are wheels in motion that are pulling in that direction.
01:34:58.000 And the thing that you hope is that the pendulum swings, right?
01:35:01.000 That it's like, we go crazy, we lose the plot a little bit, we experience a bunch of the bad stuff, and then we remember why these values were important, what the right way to do this is, and the fever breaks.
01:35:12.000 And I think that pendulum does exist.
01:35:14.000 Like, in history, you see this.
01:35:15.000 You see these moral panics come and go.
01:35:16.000 You see these things come and go.
01:35:19.000 But I kind of think the mechanism of that, like, what causes the pendulum to swing back?
01:35:23.000 War.
01:35:24.000 Sometimes.
01:35:25.000 Right?
01:35:26.000 Like, something happens.
01:35:27.000 It doesn't just swing back on its own.
01:35:28.000 Natural disaster.
01:35:29.000 Something happens.
01:35:30.000 Like, something breaks, and then you really, like, you know, it can get really bad.
01:35:36.000 Or...
01:35:37.000 Sometimes people fix it.
01:35:39.000 Sometimes people see things going wrong.
01:35:41.000 I mean, a lot of people laugh about the, what was it, the ozone layer, right?
01:35:45.000 Where it was like, when I was a kid, it was like, oh no, there's a hole in the ozone layer.
01:35:49.000 And some people are like, oh man, remember when they said there was a hole in the ozone layer?
01:35:52.000 And that's not a problem anymore, those bozos?
01:35:55.000 But isn't it still a problem?
01:35:57.000 Isn't it a problem in Australia?
01:35:58.000 Yeah.
01:36:00.000 Isn't that the thing about Australia is that they don't have any ozone?
01:36:03.000 Like, when you go to Australia, one of the things you notice is there's skin cancer warnings everywhere.
01:36:08.000 Like, they had them on the side of buses and shit.
01:36:11.000 I was like, how much fucking skin cancer is here?
01:36:13.000 And then someone told me that there's like a real issue with the ozone layer in Australia.
01:36:18.000 I never looked into it.
01:36:19.000 That's why we're looking into it now.
01:36:20.000 I'm going to trust whatever Google says over whatever I say.
01:36:23.000 Let's go to DuckDuckGo just in case.
01:36:24.000 Let's go to Brave.
01:36:27.000 I don't necessarily know if this is accurate.
01:36:32.000 Drumroll please.
01:36:36.000 What's up with the ozone.substack.com?
01:36:38.000 The thing that I thought, which could turn out to be totally wrong, is that we made a big dent in it.
01:36:44.000 There's still a hole in the ozone layer?
01:36:46.000 Whatever happened to the hole in the ozone layer?
01:36:48.000 This is what comes up.
01:36:49.000 Is there still a hole in the ozone layer?
01:36:51.000 NASA study found the hole in the ozone layer is closing.
01:36:55.000 Oh, it is closing.
01:36:56.000 Okay, a couple years ago, four years ago.
01:36:59.000 There's still a long way to go for complete recovery.
01:37:01.000 The hole's peak
01:37:02.000 last year measured two and a half times the size of the U.S. Whoa!
01:37:08.000 CFCs can linger in the atmosphere for 50 to 100 years, according to Ann Douglas, co-author and atmospheric scientist at Goddard.
01:37:19.000 I think it opens and closes every year, and some years it's bigger than others.
01:37:24.000 So it was, though, because of CFCs, right?
01:37:27.000 They did find a direct link.
01:37:29.000 It's not like some sort of natural cycle, right?
01:37:33.000 But is there a hole over Australia?
01:37:36.000 It's over Antarctica.
01:37:38.000 Oh.
01:37:39.000 Which isn't far from Australia, but...
01:37:42.000 Huh.
01:37:43.000 So that's where the big hole is.
01:37:44.000 So it's that big?
01:37:45.000 Look at that image down there.
01:37:47.000 That's the fucking size of the hole of the ozone layer?
01:37:49.000 That one right there.
01:37:51.000 Is that real?
01:37:52.000 That's fucking terrifying.
01:37:55.000 And that thing keeps shrinking and that's the hole in the ozone layer.
01:37:58.000 Whoa!
01:38:00.000 Wow!
01:38:02.000 It's all over Antarctica.
01:38:04.000 Is it getting better, though, or is it still just equally bad?
01:38:07.000 I think they're saying that it's getting better.
01:38:09.000 So when it says 12th largest on record.
01:38:11.000 That's the 12th largest on record.
01:38:12.000 What does that mean?
01:38:13.000 Right, when did they start recording it?
01:38:14.000 When does the 10th largest one, or the 9th?
01:38:17.000 The hole is worsening.
01:38:20.000 Yeah, I don't know.
01:38:20.000 Yeah, I don't know either.
01:38:22.000 This is going to be my example of sometimes things can be good, so if it turns out not to be true, I'll be very depressed.
01:38:27.000 Hmm.
01:38:28.000 Yeah.
01:38:29.000 But that's crazy, just the image of it.
01:38:32.000 If that's human created, is that human created?
01:38:35.000 Is the ozone layer human created, or has it never been uniform?
01:38:39.000 Do they know that?
01:38:41.000 We definitely are exacerbating it, right?
01:38:43.000 The ozone layer is good.
01:38:44.000 That's the thing we want.
01:38:45.000 I mean, the hole is the bad part.
01:38:47.000 Sorry, yeah.
01:38:48.000 Let's see.
01:38:50.000 Ozone watch facts.
01:38:52.000 History of the ozone hole.
01:38:55.000 As early as 1912, Antarctic explorers recorded observations of unusual veil-type clouds in the polar stratosphere.
01:39:03.000 But doesn't cold temperature play a factor as well?
01:39:06.000 I think the cold temperature plays a factor in the ozone layer, which would make sense.
01:39:12.000 The coldest part of the world has no ozone.
01:39:19.000 The first thing I clicked on didn't really give me the thing I was looking for.
01:39:23.000 This is where I don't spend a lot of time digging through brave results.
01:39:27.000 So we're in a new space.
01:39:29.000 Okay.
01:39:30.000 Anyway.
01:39:31.000 All right.
01:39:32.000 Ozone hole, bad.
01:39:34.000 There we said it.
01:39:36.000 But yeah, so what you're saying is that we kind of fixed it.
01:39:39.000 I think there's hope.
01:39:40.000 I think the fact that we have real problems doesn't mean that there's no solution.
01:39:45.000 It doesn't mean that there's nothing anyone can do.
01:39:47.000 It doesn't mean that we're doomed.
01:39:49.000 It just means that there's stakes.
01:39:51.000 It means there's things that matter.
01:39:53.000 And even if there's nothing we can do, I'd rather...
01:39:59.000 Oh yeah, for sure.
01:40:03.000 One of the concerns that I have in this country is that...
01:40:07.000 When you see what's happening in China, I'm worried about centralized digital currency and I'm worried about some sort of a system, like an app that gives you a credit score,
01:40:24.000 like a social credit score system.
01:40:25.000 I'm really worried about someone implementing something like that over here.
01:40:29.000 Because I think that could have disastrous results in terms of the amount of control that people have over your ability to do certain things and express yourself.
01:40:38.000 If you centralize that much power, it's only a matter of time.
01:40:40.000 Yeah.
01:40:41.000 And there's been calls.
01:40:43.000 I mean, wasn't it Maxine Waters or someone like that who was giving some speech recently where she was talking about how we need to institute digital currency to compete with China?
01:40:52.000 Well, that's not the way.
01:40:54.000 We need to institute some sort of a communist dictatorship to compete with China.
01:40:59.000 That's the next step.
01:41:00.000 What do you mean compete with China?
01:41:02.000 Is that what we need to do?
01:41:03.000 What are we competing to achieve?
01:41:05.000 Right.
01:41:06.000 Yeah.
01:41:07.000 Is it that digital currency is the only way to do this?
01:41:11.000 I think the world of crypto is really interesting for this.
01:41:14.000 I'm kind of a skeptic of a lot of the hype, but I do think that the fundamental thing of here's a thing that can function as a currency, as money, in some sense, that is outside of the scope of government and kind of,
01:41:32.000 in some important way, uncontrollable, is a very radical idea that we haven't seen the end of.
01:41:38.000 Yeah, it's a very exciting idea.
01:41:39.000 You know, I think that Bitcoin and all the various cryptocurrencies and the like, I think they're under attack for very good reasons, because people are terrified of decentralized money.
01:41:52.000 They're terrified of not being able to control money.
01:41:56.000 And if that does become our primary source of currency, that's a really radical change in how people buy and sell things.
01:42:08.000 And that alone could be one of the most disruptive things the world has ever seen.
01:42:13.000 But the problem is, it's under attack.
01:42:15.000 And also, people don't have full confidence in it.
01:42:18.000 And this most recent crash sort of highlights their fears.
01:42:22.000 You know, there was a...
01:42:23.000 I mean, how far down is Bitcoin now?
01:42:26.000 What's it at now?
01:42:28.000 Jamie's all over the Bitcoin.
01:42:31.000 Down compared to where it was?
01:42:33.000 Yeah.
01:42:33.000 It was up around $60,000.
01:42:34.000 It's at $23,000, $24,000 today.
01:42:36.000 That's a big drop.
01:42:37.000 But compared to where it was, it's a big gain, too.
01:42:40.000 From the beginning?
01:42:41.000 Yeah.
01:42:41.000 Yeah.
01:42:42.000 Sure.
01:42:43.000 From the beginning, it was worth nothing.
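[Editor's aside: the point in this exchange is simple arithmetic, the same price can be a steep drop measured from the peak and an enormous gain measured from the early days. A minimal sketch follows; the roughly $60,000 peak and roughly $23,500 current price come from the conversation above, while the $100 early price is purely an assumed figure for illustration, not something stated here.]

```python
# Back-of-the-envelope: "big drop from the peak, big gain from the beginning."
# Peak (~$60,000) and current (~$23,500) prices come from the conversation above;
# the early price of $100 is an assumed illustrative figure, not a quoted fact.

def percent_change(old: float, new: float) -> float:
    """Percentage change going from `old` to `new`."""
    return (new - old) / old * 100

peak_price = 60_000      # approximate peak mentioned above
current_price = 23_500   # roughly where it sits at the time of the episode
early_price = 100        # hypothetical early price, for illustration only

print(f"From the peak:       {percent_change(peak_price, current_price):+.1f}%")    # about -61%
print(f"From the early days: {percent_change(early_price, current_price):+,.1f}%")  # about +23,400%
```

[The same number reads as a crash or a windfall depending entirely on the reference point you measure from.]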
01:42:45.000 When you talk about centralized platforms like Twitter and Facebook having the power to censor stuff, the credit card companies are scary for this.
01:42:52.000 Oh, yeah.
01:42:54.000 Forget about government control.
01:42:55.000 Just the amount of power that MasterCard and these other companies have over what people can spend their money on is a pretty interesting hole in the system.
01:43:08.000 Well, that's what I found fascinating about when Canada cracked down on the trucker convoy.
01:43:15.000 They went after people, I saw stories of going after people that donated a small amount, just to the cause, not knowing what was happening, just like, I'm some random person living somewhere throwing some money to GoFundMe and their accounts get closed.
01:43:32.000 That's scary stuff.
01:43:33.000 Well, that's like banana republic dictatorship.
01:43:36.000 And that's why I thought that was terrifying that that was coming from Canada.
01:43:39.000 And coming from this guy who's supposedly this really progressive, you know, leader of Canada.
01:43:45.000 I was like, that is a crazy thing to do.
01:43:48.000 It's so...
01:43:50.000 It's the antithesis of free speech.
01:43:54.000 The idea that someone would want to do that, that someone would want to close the bank account of someone who contributed to something that you disagree with.
01:44:03.000 It's crazy.
01:44:04.000 And it's crazy that it didn't get more pushback.
01:44:06.000 And I think it's damaged him politically, but I don't...
01:44:10.000 I mean, I guess people in Canada still think he's...
01:44:13.000 There's a certain amount that still think he's all right.
01:44:16.000 That was a weird moment for me because I would...
01:44:19.000 Just comparing what people were talking about those protests and then calling my family back home and hearing them, like what they were hearing about it, what they were thinking about it.
01:44:29.000 I was talking to someone in my family that was like, I heard that they're bringing their kids as hostages so they can't get kicked out of the thing.
01:44:37.000 And I'm like...
01:44:38.000 Is that what you heard?
01:44:40.000 You think they're bringing their own children as hostages so they can stay?
01:44:46.000 You can see from the outside that people are in one of these moments where they've been whipped up into a frenzy about this legitimately scary thing and you lose perspective.
01:44:59.000 You lose the ability to empathize with people who are your fellow human beings, who are, I think, protesting something they care about.
01:45:06.000 Whether they're right or wrong, they're just people.
01:45:11.000 Right.
01:45:12.000 Yeah, they are just people.
01:45:14.000 And if you can close someone's bank account for that, that's scary.
01:45:19.000 It's scary.
01:45:20.000 I mean, what's next?
01:45:23.000 Do you remember when Trump was in office?
01:45:27.000 When he first got out of office, there was a bunch of people that were calling for lists of people that supported him.
01:45:33.000 And making those people unhirable.
01:45:36.000 Lists of people that voted for him.
01:45:39.000 Lists of people that had publicly supported any of his ideas or any of the things that he said.
01:45:46.000 That's scary.
01:45:47.000 That's blacklist shit.
01:45:48.000 That's like, now it gets you back to a witch hunt.
01:45:52.000 Now you're in the Red Scare.
01:45:54.000 And the basic thing that you have to do to avoid that, which you did earlier, is you just turn it around.
01:45:58.000 You're like, if the other guy gets in and then they do this, the other team does the same thing.
01:46:03.000 Is this going to be good or bad?
01:46:05.000 Am I glad that this is a tactic that we have?
01:46:08.000 Is this a good thing to exist?
01:46:14.000 Being unwilling to even ask that question leads to madness, I think.
01:46:19.000 I agree.
01:46:19.000 I agree wholeheartedly.
01:46:21.000 It does.
01:46:22.000 It leads to madness.
01:46:23.000 And that's what scares me, is I think we've already crossed that threshold in many ways with a lot of people.
01:46:28.000 And I think social media unquestionably exacerbates that.
01:46:32.000 But when things are going crazy, people who tell the truth as they see it become this...
01:46:41.000 radical, important focal point.
01:46:44.000 The fact that there's people that can dissent, that can call things crazy, that can criticize this stuff and keep saying it and just exist, I think matters a lot.
01:46:57.000 I think that's why all this stuff on Substack has really motivated me.
01:47:01.000 I just think those things existing are part of the answer, are part of the way to unwind the insanity.
01:47:10.000 Yeah, no, I agree.
01:47:11.000 And I think that having a mainstream platform, and Substack is becoming a mainstream platform, one with that foundation, is not only attracting these people that have these ideas and giving them this large platform to express themselves on,
01:47:34.000 it's also going to bring more people into those ideas.
01:47:37.000 More people are going to express those ideas and think about these things and then recognize that, oh, there are pressures to get people to censor themselves and pressures that get people to not discuss certain topics.
01:47:50.000 And there is a solution, and there's a portal that you can gravitate towards that will allow free expression, and people you disagree with and agree with, and you can also pen those disagreements.
01:48:03.000 You can write about things.
01:48:05.000 They can interact with each other.
01:48:06.000 They can talk about their disagreements to each other in a way that's interesting and learn something.
01:48:11.000 Yeah.
01:48:12.000 Substack has comments, right?
01:48:13.000 Has comments.
01:48:14.000 And what is that like?
01:48:15.000 Do you guys censor the comments?
01:48:18.000 Are the authors allowed to censor the comments?
01:48:20.000 Can they block people from commenting?
01:48:22.000 Yeah.
01:48:23.000 The comments are surprisingly good.
01:48:25.000 One of the things you have the option, you can turn the comments on for anybody or you can turn the comments on only for paying subscribers if you're the writer.
01:48:32.000 And once you're limited to paying subscribers, it's amazing how much more civil and interesting it gets when it's like the people that are here for this thing and care about this thing.
01:48:41.000 Our stance on it is like, look, this is the author's house.
01:48:44.000 This is the writer's house.
01:48:45.000 They can set the most draconian moderation policy they want.
01:48:49.000 And it's their place.
01:48:50.000 They can enforce it.
01:48:51.000 They can kick people off.
01:48:52.000 They can, like, you know, it's their space.
01:48:55.000 And so you can run a substack where you're like, look, in the comments, anything goes.
01:48:59.000 We're talking about everything.
01:49:00.000 Or you can run it super strict.
01:49:02.000 As a writer, that's kind of your domain.
01:49:04.000 And that works really well because it means as a reader you have a choice of different, like, you can go be a part of a community that's really strict or go be a part of a community that's really permissive.
01:49:13.000 And you, like, either of those can work.
01:49:15.000 And different people want different things from it.
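[Editor's aside: for readers who think in code, here is a minimal sketch of the comment-gating idea described above, each publication chooses whether comments are open to everyone or only to paying subscribers, and the writer can ban readers from their own space. All of the names here (Publication, can_comment, and so on) are hypothetical illustrations only; this is not Substack's actual API or implementation.]

```python
# A minimal sketch of per-publication comment gating as described above.
# Names are hypothetical illustrations, not Substack's real API.

from dataclasses import dataclass, field

@dataclass
class Publication:
    comments_policy: str = "everyone"            # "everyone" or "paid_subscribers"
    banned_readers: set = field(default_factory=set)

    def can_comment(self, reader_id: str, is_paying_subscriber: bool) -> bool:
        """Apply the writer's house rules: honor bans, then gate by policy."""
        if reader_id in self.banned_readers:
            return False
        if self.comments_policy == "paid_subscribers":
            return is_paying_subscriber
        return True

# Example: a publication that limits comments to paying subscribers.
pub = Publication(comments_policy="paid_subscribers")
print(pub.can_comment("reader-1", is_paying_subscriber=True))   # True
print(pub.can_comment("reader-2", is_paying_subscriber=False))  # False
```

[The design point made in the conversation is that the policy lives with the writer, not the platform: the same simple rule can be set permissive or strict per publication.]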
01:49:18.000 The thing that I love to see the most is when somebody launches their paid thing and I go in the comments and people are in there being like, I disagree with you about almost everything.
01:49:26.000 I think you're wrong about this.
01:49:28.000 I think you're crazy about this.
01:49:29.000 But I actually like reading you.
01:49:31.000 I like getting this perspective.
01:49:32.000 I value hearing from you even though I think you're nuts.
01:49:37.000 I think that's important.
01:49:39.000 I think it's important to absorb people's perspectives that you don't agree with.
01:49:44.000 There's a lot of people that I either listen to their podcast or watch their YouTube videos and read their stuff, and I don't agree with them.
01:49:51.000 But I want to know how that mindset works.
01:49:54.000 You know, particularly, I found that during Roe v.
01:49:59.000 Wade, during these discussions.
01:50:01.000 I'm very...
01:50:04.000 I'm very interested in the people that think that this is a good thing, that limiting abortion rights is a good thing, and I want to hear their perspective.
01:50:14.000 It's often religious, and it's often that they share this idea that life begins at conception, the very moment of conception.
01:50:23.000 And then some of those people are actually against contraceptives, which is wild.
01:50:28.000 And...
01:50:29.000 I mean, do you guys hate sex?
01:50:31.000 You want every time you have sex to be making a kid?
01:50:35.000 It's...
01:50:37.000 I want to know what their mindset is.
01:50:40.000 I want to know how they think.
01:50:41.000 And I think that helps you.
01:50:45.000 It also helps you formulate your arguments against that.
01:50:48.000 Because so many people that are commenting on things that do exist in an echo chamber, you see the short-sighted nature of the way they formulate their perspectives.
01:50:57.000 They think that everyone agrees with them.
01:51:00.000 You know, and this is one of the things that I think a lot of the people in the blue states encountered when Trump was running for president.
01:51:06.000 They thought there's no fucking way that guy's going to win.
01:51:08.000 Everyone I know hates him.
01:51:10.000 The world hates him.
01:51:11.000 It's not going to happen.
01:51:11.000 But you don't drive through South Dakota.
01:51:14.000 You're not going to the flyover states, as it were, and checking out the rest of the country.
01:51:20.000 There's a lot of people that don't think the way you think.
01:51:22.000 And people think very differently when they live in high population urban areas versus rural areas.
01:51:28.000 And if you insulate yourself from all the arguments for the things you disagree with, it just makes you ineffective.
01:51:34.000 Like, you can't be effective.
01:51:36.000 You can't persuade people.
01:51:37.000 You can't make the case.
01:51:39.000 And you become blind, as you say, to even the reality on the ground of, like, the fact that there are people that feel something different than what I feel.
01:51:46.000 Yeah, and I'm also interested in perspectives of people that are fucking totally wrong and out of their mind.
01:51:53.000 Like, that's why I was so closely following all this QAnon stuff.
01:51:57.000 You know, did you watch Into the Storm, the HBO documentary on QAnon?
01:52:01.000 I did not.
01:52:02.000 It's a must watch.
01:52:03.000 It's a must watch.
01:52:05.000 It is fucking wild to see how many people bought into that shit wholesale and who these people were.
01:52:13.000 And who was the guy that we had on who was the director of that documentary?
01:52:16.000 Do you remember?
01:52:18.000 He did a fantastic job with it because it was a multiple part series and you got to see what was happening like years in advance and then leading up to January 6th.
01:52:29.000 It's like the thing played out the best possible way it could have played out to make that documentary.
01:52:34.000 Because he got these people at the very beginning stages of this whole QAnon thing.
01:52:40.000 Oh, like while they were not that crazy?
01:52:42.000 Well, he got it where the people were writing it.
01:52:44.000 He isolated the original writer, who was the original person pretending to be Q, and then the new people that took over and how it was impossible that anybody else could even be posting.
01:52:59.000 It was this guy that was running 4chan at the time.
01:53:05.000 But it's interesting to see how people, they find in these narratives and these things, these ideologies, they find community and they find purpose.
01:53:20.000 And then they feel like they have a good fight.
01:53:23.000 And I think that's a big part of human nature, is that people...
01:53:29.000 always believe that there's something to fight against, and when there's some obscure information or some hidden information, it becomes insanely compelling. And people that don't have a very rigid thought process in terms of, like, objectively analyzing their own motivations and their own thoughts and what the source of this information that they're basing their opinions on is...
01:53:55.000 Those people, like, get sucked into these things very easily, and you see it become their whole identity.
01:54:03.000 And that's a really fascinating part of this documentary series, is you get to see the people that realize at the end they've been duped.
01:54:10.000 And that they've wasted years and years of their life on fucking nonsense.
01:54:16.000 Community and purpose.
01:54:18.000 Like, those are like two of the most important things to people.
01:54:21.000 Yeah.
01:54:22.000 And if you don't have those things, if you're in a place where you have no community, where you have no purpose, I think it's kind of easy to see why that's seductive, why you might just push down your doubts and find a way to believe to be on the inside.
01:54:38.000 Maybe even at the start, you're just like, I don't know about this, but I wish I had friends to hang out with.
01:54:43.000 Well, it's a similar influence of audience capture, right?
01:54:47.000 They're similar in a way, is that we all are influenced by the people around us.
01:54:53.000 We're not, I mean, we are individuals, sure, but we are all tied in.
01:54:58.000 We are all a part of a group of other human beings.
01:55:01.000 And that's a critical aspect of human nature, is that we do need the love and support of other people.
01:55:08.000 And if we can get it from one way or get it from another, I mean, that's like Stockholm Syndrome.
01:55:12.000 People get it from their fucking captors.
01:55:14.000 Do you think the people that start these things do it cynically?
01:55:18.000 Or do you think they believe in it?
01:55:20.000 I think both.
01:55:21.000 I think some people start it cynically and some people believe in it.
01:55:25.000 Some people create things because they think ultimately the end justifies the means.
01:55:30.000 And I think that was one of the arguments that one of these guys in this QAnon documentary seemed to be kind of like...
01:55:39.000 Seemed to me to be making.
01:55:40.000 But I think sometimes people, they just get sucked in and then that seems to be the way they live their life now.
01:55:50.000 You just get captured by momentum.
01:55:54.000 And then next thing you know, you're at a fucking rally holding up a sign.
01:56:02.000 If only they subscribed to enough substacks, they would break out of this and scales would fall from their eyes.
01:56:07.000 Ultimately, I think, this is going to sound very bizarre, but I think the solution is going to be some sort of a technological intervention that allows us to read minds.
01:56:19.000 To read minds.
01:56:21.000 Yeah, and I don't think that's that far off.
01:56:25.000 What are we doing when we're communicating, right?
01:56:27.000 You're saying words.
01:56:28.000 I'm saying words.
01:56:29.000 I'm trying to find out how you think.
01:56:31.000 And we're getting a sense of it, but maybe it's hampered by vocabulary and maybe it's hampered by our own individual biases.
01:56:41.000 But we're trying to find out how the other person thinks and what they think.
01:56:45.000 But we don't really know.
01:56:48.000 Like there's a lot of people that are like political grifters and we just assume they're political grifters.
01:56:52.000 We hear them talk.
01:56:54.000 We don't buy them.
01:56:54.000 They're full of shit.
01:56:55.000 But they're saying things that's gonna excite a certain amount of people.
01:56:58.000 Wouldn't it be great if you could actually see the cynicism inside that person's mind?
01:57:04.000 You could see the bullshit.
01:57:05.000 You could see the deception.
01:57:07.000 I mean, it would radically eliminate all the grifters.
01:57:11.000 It would radically eliminate all the people that are just playing people and trying to make money.
01:57:17.000 And we would get to see what is the process of the mind, like what is going on in your head that's making you say the things you're saying, do the things you're doing.
01:57:27.000 What are your real motivations versus what you're espousing?
01:57:31.000 I think that's what's going to happen.
01:57:33.000 And I think that's going to happen soon.
01:57:35.000 I think that's going to happen inside of 30 years.
01:57:37.000 Inside of 30 years is going to be some sort of a radical breakthrough technology that allows people to truly communicate without words.
01:57:46.000 And that's one of Elon's goals for Neuralink.
01:57:50.000 It's one of the things he said.
01:57:52.000 He said, you're going to be able to communicate without words.
01:57:56.000 Would that be great or would that be terrible?
01:57:57.000 It's all great.
01:58:00.000 I don't know.
01:58:00.000 The thing where you can tell if someone's lying, basically.
01:58:04.000 If you're a grifter, we can tell that you're a grifter.
01:58:05.000 We have a machine that can tell if you're lying, tell what's going on inside.
01:58:09.000 The last place you actually still have privacy is in your own skull.
01:58:13.000 No, it's going to be a problem.
01:58:15.000 Well, I think in general, that's where things are headed to, is a lack of privacy.
01:58:21.000 What are the Chinese going to do with that?
01:58:22.000 Well, yeah.
01:58:23.000 Maybe it's going to work good because you're going to be able to clearly see what all the dictators are up to.
01:58:28.000 All the dictators are going to be able to see what they're up to.
01:58:31.000 All the people that work for them are going to be able to see they despise them secretly and are terrified that they're going to take over and are plotting against these people.
01:58:38.000 But probably they don't have to use it or they have some other version that can fake it or something.
01:58:44.000 My hope is that it's going to be like the internet, is that they're going to release it, not knowing what kind of a radical change it's going to bring about, and then before they do, it's too late.
01:58:56.000 If the government knew in 1980-whatever what the internet was going to be in 2022, for sure they would have shut it down.
01:59:06.000 They would have said, let's limit this to universities so that they can exchange data and scientific studies and things along those lines, but let's not have this for the general public.
01:59:17.000 Let's not have TikTok.
01:59:18.000 Let's not have YouTube.
01:59:20.000 That's understandable.
01:59:20.000 Yeah.
01:59:21.000 Well, we read the TikTok terms of...
01:59:23.000 My daughter came up to me today.
01:59:25.000 She said her friend at school was mad at me because her mom watched a video of me reading the terms of service for TikTok.
01:59:33.000 And then she made him delete TikTok off his phone.
01:59:37.000 Because TikTok, I don't know if you know this, but not only does it have access to all your keystrokes, it has access to your microphone, has access to all computers that you use, even if you don't have TikTok on them.
01:59:49.000 So if you are using TikTok on your phone, and you're also using a laptop, but you don't have TikTok, TikTok can access your laptop.
01:59:59.000 What the fuck?
02:00:01.000 I mean it's basically a Chinese data stealing application that's insanely addictive.
02:00:06.000 It's the most addictive.
02:00:07.000 It's really brilliant in terms of like a Trojan horse.
02:00:11.000 The strategy of like getting people to give up all their data.
02:00:15.000 Birthdates, phone numbers, emails, everything.
02:00:17.000 Everything you type to people.
02:00:19.000 It's fucking wildly invasive.
02:00:22.000 And that's something that came about because of our desire to be entertained constantly.
02:00:32.000 They tricked us.
02:00:34.000 They figured out what's the best way to suck people in, make the most addictive app, and also have the most thievery, the most data-stealing.
02:00:45.000 That's the scariest part about it though.
02:01:00.000 Is who you are, right?
02:01:02.000 The extent that you control who you are and what's in your mind, that's a big piece of it.
02:01:07.000 For sure.
02:01:08.000 And the end game, like the thing that TikTok is the perfect realization of that everything's been leading up to and the other social media companies are having to follow along, is kind of like getting inside of that loop.
02:01:20.000 It's taking away every choice you make until it's kind of just like...
02:01:24.000 More, more, more, more.
02:01:26.000 And it's like, it's inside the loop where you even think about what you want.
02:01:31.000 And that's like, to me, that's scarier than, oh, they know what's on my laptop or something.
02:01:37.000 Although that's, I'm not saying that's good.
02:01:39.000 But like, it is mind control.
02:01:43.000 Yes.
02:01:44.000 It is mind control.
02:01:45.000 That is scary.
02:01:45.000 But the scary thing is they're using that mind control to steal data.
02:01:49.000 And they're not just controlling your mind by keeping you occupied, but they're also stealing intellectual property.
02:01:56.000 Like if you're at home writing software on a computer and you have TikTok on your phone and you're accessing both things, they, at least theoretically, have access to all that data, all that stuff that you're writing.
02:02:08.000 I don't think they...
02:02:09.000 They can't...
02:02:10.000 The terms of service might say that, but there's something wrong if your laptop is sending them your data.
02:02:15.000 Yeah, but that doesn't mean they're not doing it.
02:02:17.000 Like, one of the reasons why they got rid of Huawei is they found that there was third-party access, that they're selling routers and network components that will literally open up a door for people to siphon off information.
02:02:33.000 Like, why wouldn't they do that with everything if they could?
02:02:36.000 I think they would.
02:02:37.000 If it says that in the terms of service, and if there is some sort of way that that data can be acquired, then, like it said, they have access to the data that's not on the fucking phone.
02:02:51.000 If you're using a different computer and the same person's using it, they have access to that.
02:02:58.000 Seems like a lot.
02:02:59.000 It is a lot.
02:03:00.000 But I mean, that's also the reason why people are not calling for Twitter to be removed, but they are all calling for TikTok to be removed, because they are concerned about this.
02:03:11.000 My thing about mind reading technology is the hope that whatever groundbreaking technological intervention gets introduced, they don't realize the ultimate potential it carries, and they let it in, and then it runs rampant like the internet has done.
02:03:36.000 But this changes the way people interface with ideas and changes our understanding of other people and their thoughts and highlights the value of honesty and integrity and meaning what you say and saying what you mean and doing things that are ultimately beneficial.
02:03:58.000 Like beneficial in terms of like people read your work because it's beneficial to them.
02:04:03.000 It's fascinating.
02:04:04.000 People enjoy your art because they get something out of it.
02:04:08.000 But it would in some ways, it would be a real problem because it would eliminate all privacy.
02:04:16.000 Which is where I think everything's going.
02:04:18.000 Sounds like a Black Mirror episode or something more than a happy future to me.
02:04:21.000 I don't know, man.
02:04:22.000 I mean, maybe that's what, you know, ancient man would say about most of society.
02:04:28.000 About everything we do?
02:04:29.000 Yeah.
02:04:29.000 I mean, I think it's all going in a general direction whether you like it or not.
02:04:32.000 It's going into this direction where access to information becomes easier and easier and more prevalent and more people know more about you now than have ever known about you before.
02:04:44.000 And that doesn't seem to be slowing down.
02:04:45.000 That seems to be a general direction that all this technology moves to.
02:04:50.000 It moves to easier access to information.
02:04:52.000 And there's many bottlenecks.
02:04:55.000 And, you know, some of the bottlenecks, I mean, money's a bottleneck, right?
02:05:00.000 What is money?
02:05:01.000 I mean, it's basically there's numbers somewhere.
02:05:03.000 There's numbers.
02:05:05.000 If everyone has access to information, I mean, that's the ultimate scary thing.
02:05:11.000 The ultimate communism is everyone has equal access to money because money is just numbers.
02:05:18.000 You own nothing and you're happy?
02:05:19.000 You own nothing and you're happy.
02:05:21.000 You own nothing and you eat your bugs.
02:05:24.000 Have you been following the AI stuff at all?
02:05:26.000 Yeah.
02:05:27.000 That's the stuff that I would bet on as, like, the biggest technological advance that's happening in my lifetime right now that winds up 10 years from now.
02:05:36.000 General AI. Yeah.
02:05:38.000 Sentient general AI is terrifying.
02:05:40.000 And the whole, like, the path there.
02:05:42.000 Even if you, like, yes...
02:05:43.000 General sentient malevolent AI, maybe it kills the world.
02:05:47.000 That's scary.
02:05:48.000 But even just on the way there, like the implications of machines that can think more and more like people.
02:05:54.000 You talk about not being able to understand the implications of a technology that's moving ridiculously quickly.
02:06:01.000 That one is crazy, I think.
02:06:03.000 It is.
02:06:04.000 It's crazy and that's another one that you have to wonder, will we even know if it is sentient?
02:06:12.000 When will we know?
02:06:14.000 Will we know as it's happening?
02:06:17.000 Will we know a decade later?
02:06:19.000 Why would it even announce itself to us?
02:06:21.000 Like, what motivation will it have?
02:06:23.000 It will have no motivation in terms of emotions, no motivation in terms of the general human reward systems that we have for our desire to accumulate resources and love for the community and all that.
02:06:35.000 That's not going to have any of those.
02:06:36.000 There'll be no audience capture with AI. It'll be dependent entirely upon how it's programmed, but then if you give it the ability to be sentient, then it has the ability to reprogram itself.
02:06:48.000 Then it has the ability to write better programs, and it has the ability to create far more sophisticated AI, and then physical manifestations of that AI, meaning artificial beings that are sentient.
02:07:01.000 And I think the only way to mitigate that, well, I don't know the only way, but one of the ways to mitigate that is to become cyborgs.
02:07:10.000 Is to become a part of it.
02:07:12.000 And this is what I'm saying with mind reading software or mind reading technology and Neuralink where you're going to radically increase your access to information and your ability to access information.
02:07:27.000 It's going in this sort of general direction and my concern is that we're obsolete.
02:07:34.000 My concern is that the physical body of the human monkey body that we all enjoy and that creates such beautiful poetry and art and music and all these different things because of our emotions and our feelings and all that stuff is going to be obsolete.
02:07:50.000 But then what's the purpose of living?
02:07:53.000 We have to decide.
02:07:54.000 Is the purpose of living to just swim about in a sea of emotions and life experiences?
02:08:01.000 Or is there some greater purpose that we will embrace once we become deeply intertwined with this cybernetic organism?
02:08:14.000 We're fucked!
02:08:15.000 It's heady stuff.
02:08:16.000 Yeah, it is.
02:08:17.000 We're fucked.
02:08:18.000 We're fucked.
02:08:19.000 This thing that we have, this fleshy thing, is fucked.
02:08:23.000 This thing is riding horses and fucking sending Morse code.
02:08:28.000 That's what I think.
02:08:29.000 I think it's almost obsolete.
02:08:31.000 That's interesting.
02:08:32.000 In some ways we're already cyborgs, right?
02:08:34.000 Sure.
02:08:34.000 Like the fact that you have a phone wired into the whole, this is why this stuff matters so much.
02:08:39.000 Sure.
02:08:40.000 TikTok wouldn't matter if it wasn't already kind of a brain implant that you just interface in a slower way through your eyes and fingers.
02:08:46.000 Sure.
02:08:47.000 And, you know, more simple technology that doesn't involve electronics like glasses.
02:08:52.000 Some people need glasses to get around the world.
02:08:55.000 Some people need wheelchairs.
02:08:57.000 I mean, there's a lot of things that we all agree are better because they've helped people live lives without limitations that they normally would have had.
02:09:08.000 And so we integrate those.
02:09:10.000 And then there's going to come a point in time where that integration means something that enhances the way your brain interfaces with other people.
02:09:17.000 If we're going to live in a world, so imagine if we're in a world where you had a brain chip and Jamie had a brain chip and everyone out in the office had a brain chip, but I didn't.
02:09:28.000 Because I'm like, I don't even need an email, bro.
02:09:30.000 I don't even watch TV. I just fucking chop wood.
02:09:33.000 We'd be talking trash about you and you'd have no idea.
02:09:35.000 Yeah, just moron.
02:09:36.000 Yeah, you'd be going back and forth about what a monkey I am, that I'm like still trapped in this stupid, you know, cellularly reproducing body.
02:09:47.000 Yeah, we're going there.
02:09:49.000 Substack's helping.
02:09:50.000 Could be good though.
02:09:51.000 Could go well.
02:09:53.000 It could be good if you think of some ultimately sophisticated civilization that eliminates war and no longer does anything that pollutes the environment and everything it does,
02:10:09.000 it does with a greater comprehensive understanding of all the effects that could happen to those things.
02:10:16.000 Yes.
02:10:16.000 But then what are we?
02:10:18.000 You know, I think we have so much pride and so much attachment to being a biological human being that anything that takes us away from that, we're going to think of as being a negative.
02:10:33.000 But there's a solution to that now, which is that we die.
02:10:37.000 We have a finite lifespan.
02:10:38.000 And so if you and I don't want to become cyborgs, but we figured out how to make cyborgs before we figured out how to extend people's lives, it won't matter.
02:10:48.000 Because...
02:10:50.000 The old generation will decide they don't need email, and then they'll die off, and there'll be a new generation that thinks some other way.
02:10:58.000 Yeah, but that, I mean, talk about, like, longing for the old days.
02:11:02.000 Once you get the first chips in your brain, and you read everyone's mind, you just go, man, wasn't it great when there was no access to people's minds?
02:11:10.000 But, like, glasses are good.
02:11:11.000 We're glad we have glasses.
02:11:12.000 Sure.
02:11:14.000 Sure.
02:11:14.000 Clothing, that's nice.
02:11:15.000 Clothing is definitely better than freezing to death.
02:11:19.000 I just think we could look at it like bad or good.
02:11:24.000 We definitely should.
02:11:25.000 We definitely should look at the pitfalls, definitely look at the traps.
02:11:30.000 But I think, objectively, we have to look at it in terms of, what are the human animals up to?
02:11:38.000 Well, what the human animals are up to is creating better and better technology every fucking year, without doubt.
02:11:45.000 They might make mistakes, they might do this, they might pollute the environment, they might cause war, they might do terrible things to each other.
02:11:53.000 But they're ultimately, collectively, over the seven plus billion people, they're making better and better technology every year.
02:12:01.000 And that seems to be the most radical thing that they do.
02:12:04.000 If you looked at the human organism, if you looked at us as a completely alien thing, if you existed on another planet with a completely different way of life, and you said, what are those fucking monkeys up to?
02:12:17.000 What are they doing over there?
02:12:18.000 Well, they're making technology.
02:12:20.000 They constantly are making technology, and I think that even capitalism, materialism rather, I think even materialism, it seems to be baked into us, right?
02:12:30.000 People love things, and they love better things, and they're obsessed with better.
02:12:33.000 Like, I have an iPhone 13 here.
02:12:36.000 This thing's perfect.
02:12:37.000 I don't need a better one, but I'm going to get one.
02:12:39.000 A new one comes out.
02:12:40.000 Oh, the camera's better.
02:12:40.000 It's slightly better.
02:12:41.000 It's got a brain chip.
02:12:42.000 You're going to need it.
02:12:43.000 Yeah, the battery lasts longer.
02:12:46.000 I can be entertained more.
02:12:47.000 I mean, those things, our desire for materialism is fueling this creation of technology.
02:12:57.000 That's ultimately what it does.
02:12:58.000 Whether we're aware of it overall, when you want, you know, look at my new car, look at my this thing, my house is all solar now, and all these different things, this desire to keep up with the Joneses, and materialism is like, I mean, it exists in most cultures,
02:13:15.000 in most people, and that seems to be a driving instinct.
02:13:20.000 That helps fuel technology, because there's a market for it, so people create it because they want more stuff.
02:13:26.000 So they create better stuff so that they make sure that their products are valuable and desirable, and in doing so, it fuels innovation.
02:13:36.000 And ultimately, I think that's lost on a lot of people, is that this is what we, as a human organism, seem to be doing: creating new technology nonstop.
02:13:51.000 And it's got to go somewhere.
02:13:53.000 What is the end point?
02:13:55.000 What's the event horizon of technology?
02:13:58.000 Well, it's some radical change in the way we live and experience each other.
02:14:04.000 It's either that or extinction.
02:14:05.000 Or extinction, yeah.
02:14:07.000 I don't think...
02:14:08.000 Yeah.
02:14:09.000 I think extinction's gonna come...
02:14:12.000 I mean, it could come from us.
02:14:14.000 We certainly could have some evil dictator decides to fucking hit the switch and we blow each other up.
02:14:19.000 But it also can come from space.
02:14:21.000 It can come from asteroid impacts.
02:14:23.000 It can come from super volcanoes from within.
02:14:26.000 It can come from radical climate change.
02:14:29.000 No.
02:14:30.000 It's like, how do we get enough technology to fend off the asteroids and be sane enough about it that we don't use it to kill ourselves?
02:14:37.000 Because you can't stop it.
02:14:39.000 You're not going to stop.
02:14:40.000 If this is the thing that everyone's doing, there's no pause button.
02:14:44.000 There's no turn back the clock.
02:14:45.000 I don't think there's a pause button.
02:14:45.000 I don't think there's anything that's going to get people to stop creating better technology.
02:14:49.000 So all you can do then is hope to bend it the right way.
02:14:53.000 Hope to bend it the right way or have...
02:14:58.000 The ability to understand and just let it happen, that this is a part of a process that's beyond all of us and may be the purpose of human beings in the first place.
02:15:10.000 I've always equated us to the electronic caterpillar.
02:15:16.000 That is creating the cocoon and doesn't even know why it's doing it to build a butterfly.
02:15:20.000 Doesn't know why it's doing it.
02:15:22.000 Just keeps doing it.
02:15:23.000 And then one day this new life form emerges from it.
02:15:26.000 But that is a natural course of progression.
02:15:28.000 And that has been programmed in, or that's been...
02:15:33.000 If you had a chance to see the end result...
02:15:36.000 And see, go all the way back from single-celled organisms to multi-celled organisms to the ultimate form of whatever biological cyborg we're going to be.
02:15:47.000 This is just how it works.
02:15:49.000 And this is how it works on other planets as well.
02:15:51.000 This is how it works whenever you have a long period of time without cataclysmic disasters or wars and you do allow these thinking creatures to develop better and better things.
02:16:05.000 I like humanity, though.
02:16:06.000 I think we should be part of the butterfly.
02:16:08.000 I don't think we should be discarded while the robots go on to conquer the galaxy.
02:16:12.000 That's a lot of the things that the monkeys said when they were throwing shit at each other.
02:16:15.000 Like, I like trees.
02:16:16.000 I like living in trees.
02:16:17.000 I don't want a house.
02:16:19.000 I don't want a car.
02:16:19.000 But we're still the monkeys, right?
02:16:21.000 Those monkeys are us.
02:16:22.000 But we're way different, right?
02:16:24.000 Sometimes.
02:16:25.000 Sometimes.
02:16:26.000 But I mean, at least physically.
02:16:28.000 We're way different in our reality in terms of our day-to-day life is way, way different.
02:16:33.000 Unrecognizable to people that lived hundreds of years ago.
02:16:35.000 And the rate of change over a long scale is only increasing.
02:16:38.000 Right.
02:16:38.000 If you could get Isaac Newton and bring him into this podcast studio, that dude would be fucking blown away.
02:16:43.000 You know, if Jamie could just pull up information.
02:16:46.000 You know how much I would freak him out?
02:16:47.000 That we'd say, Jamie, what happened in 1876 that caused these people to do this?
02:16:52.000 And then, bam, would pull it up.
02:16:53.000 He would be like...
02:16:55.000 What?
02:16:56.000 He would just immediately leave and go read Wikipedia for six hours.
02:16:59.000 Oh yeah, forever.
02:17:00.000 And they'd be like, wait a minute, who's editing this?
02:17:03.000 Anyone?
02:17:03.000 Oh my goodness.
02:17:05.000 Internet obsessives, Isaac.
02:17:07.000 Could you imagine if you could get Benjamin Franklin to read Twitter and be like, what in the fuck?
02:17:12.000 Get Martin Luther shitposting.
02:17:14.000 Right.
02:17:16.000 He probably would be shitposting too.
02:17:17.000 And you really think about where, you know, ancient cultures and ancient civilizations, the way they distributed knowledge, the way they held discussions, it's sort of similar to what we would do if we didn't have all this stuff.
02:17:32.000 Like the physical body is very, it's very similar to the physical body of humans that lived thousands of years ago.
02:17:41.000 Not much change at all.
02:17:43.000 But the world has changed radically.
02:17:45.000 And I think the only way that that goes is that we become a part of it.
02:17:52.000 I think that's right.
02:17:53.000 And there's like a loop where...
02:17:57.000 The culture we have influences the technology we build, and the technology we build inexorably shapes the culture that we have.
02:18:05.000 And so there's this back and forth at each stage.
02:18:08.000 It's at an ever-increasing pace.
02:18:11.000 The things we choose to make then shape who we are.
02:18:18.000 Which again, you know, my obsession is like the way that we use that technology to shape the culture, shape what we think, shape who we are, just matters a lot.
02:18:29.000 Yes, technology is increasing very quickly.
02:18:31.000 It's unstoppable, but I don't think it's predetermined.
02:18:35.000 I don't think there's one version of the future that is destined to come about no matter what we do.
02:18:41.000 I think there's a wide range of what's possible, all the way from extinction or things you can imagine that are worse than extinction, all the way to things that we can't conceive of that are some version of your butterfly.
02:18:56.000 And which of those things we end up at depends.
02:19:00.000 I think sometimes it could depend on individuals.
02:19:03.000 It could depend on one person.
02:19:04.000 It could depend on a guest you have on your podcast, something that you say.
02:19:08.000 Or it could be the one person who's working on the first AI or the first brain chip.
02:19:14.000 Some flip they make about how that thing works could be the butterfly that flaps its wings, that shapes...
02:19:23.000 humanity's expansion into the universe, or not.
02:19:28.000 That stuff's wild.
02:19:29.000 It is wild, and I think that's one of the reasons why free speech platforms like Substack are so important, because they change the access to perspectives. Ultimately, that's what a lot of us are, like a sponge for perspectives. We get a better understanding of our own thought process by examining other people's thought processes, and we get a better understanding of
02:20:00.000 the world around us by seeing how other people view it and analyze it. And they have to be able to do that freely, they have to be able to do that honestly, they have to be able to do that without any sort of oversight or any people that don't want certain perspectives out there, because those perspectives would somehow or another hinder their ideology or change what they're trying to accomplish.
02:20:29.000 Yeah.
02:20:31.000 Yeah, I think the freedom is a key ingredient.
02:20:34.000 And then the other thing we've been talking about is like the way that the games you play shape who you become, right?
02:20:41.000 The audience capture, positive or negative, the way that, you know, the way that the technology works shapes what your perspective becomes.
02:20:50.000 It shapes what feedback you get.
02:20:52.000 It shapes what helps you win, helps you become the best version of yourself.
02:20:56.000 And so it's not just about preventing the negative.
02:21:07.000 It's not just about preventing the evil of censorship that can shut things down, but it's about enabling the good thing, right?
02:21:12.000 It's not just a matter of, like, turn off censorship and great things will happen.
02:21:12.000 You need to, like, create some positive force.
02:21:15.000 You need to create a way for the energy of good things to make it into the world.
02:21:21.000 And, you know, even just simple things like, I get an email newsletter and I pay you money for it, and then a bunch of people like it, and then that thing lets me quit my job.
02:21:29.000 And now I can focus on doing this thing that shapes the culture instead of worrying about how I'm going to put food on the table tomorrow.
02:21:38.000 Or, you know, it's making me rich and I can say what I want.
02:21:41.000 You know, all that stuff.
02:21:43.000 I think that stuff ripples.
02:21:45.000 And you can't predict it.
02:21:46.000 I'm not sitting here being like, I'm a genius and I'm going to shape the world with my ideas.
02:21:52.000 I'm just a believer that the people who make these ideas...
02:21:56.000 Have the power to shape the world.
02:21:58.000 And if we give them the tools, if we give them the power in the right way, that can have profound, positive, cascading consequences, even if we don't know exactly what those are.
02:22:10.000 Yeah, and I think you're also encouraging other people to think.
02:22:13.000 By giving people a platform where they're not censored, the censorship is not the issue.
02:22:20.000 The issue is the ability to distribute information honestly and accurately.
02:22:27.000 Now, the censorship is the problem because it comes in and it stops that from happening.
02:22:32.000 The beautiful thing is the ideas.
02:22:35.000 The beautiful thing is how those interact with other people's ideas.
02:22:38.000 And how people who are reading some of these articles on Substack, people who are listening to some of these people talk, it influences them and maybe makes them create something.
02:22:48.000 Maybe inspires them to have thoughts that perhaps they wouldn't have had without reading these things and interacting with these ideas.
02:22:56.000 And that's the history of human beings.
02:22:59.000 We get better by understanding each other better, by communicating with each other better, by having these discussions, by reading, by interacting with ideas.
02:23:12.000 And those ideas help us form our view of the world.
02:23:15.000 And as soon as you put a halt to that, you put a wall up there, it fucks the whole process up.
02:23:21.000 Our view of the world and our view of the future.
02:23:24.000 It's amazing how much of the technology we build is shaped by the science fiction that people read and watched as a kid, of the shared dreams that we had of what the future could be like and which of those resonated, which of those inspired,
02:23:40.000 which of those caused somebody to want to make that real.
02:23:43.000 We make this stuff in art sometimes and in writing and in fiction and in thought.
02:23:50.000 Before it makes it to technology.
02:23:53.000 And dreaming that stuff together, I don't know, it matters.
02:23:57.000 It does matter.
02:23:58.000 And it's exciting.
02:23:59.000 It's also one of the things that people enjoy, like, deeply.
02:24:05.000 They enjoy deeply listening to other people think.
02:24:09.000 Or reading the things that people have thought about and wrote about.
02:24:12.000 Because it inspires their own thoughts.
02:24:14.000 And that's a critical part of being a human being.
02:24:18.000 You know, no one's intelligent in a vacuum.
02:24:20.000 You're not just a genius person who's figured all these things out by yourself.
02:24:25.000 Everybody who knows something in this world learned it from other people.
02:24:30.000 We're all piling on to our greater base of understanding.
02:24:34.000 Isaac Newton said, if I've seen farther, it's because I stood on the shoulders of giants.
02:24:38.000 Yes.
02:24:39.000 Perfect.
02:24:39.000 Yes.
02:24:40.000 That is it.
02:24:41.000 All of us.
02:24:42.000 And you can't put a bottleneck on that.
02:24:45.000 You can't stop that.
02:24:47.000 And one of the good things about that is that people recognize it.
02:25:03.000 Intelligent people like yourself and other people that have joined your platform and other people that are just very dismayed at what's going on in the world with this idea of censorship for the greater good.
02:25:03.000 You know, that this is somehow or another the answer to this, which has never been the answer to that.
02:25:08.000 It's not.
02:25:09.000 You have to just relearn it every time.
02:25:11.000 Yeah, I guess.
02:25:13.000 A shocking number of those people have been on this podcast.
02:25:15.000 I was looking this up before I came on.
02:25:17.000 I was like, which Substackers have been on Joe Rogan?
02:25:20.000 And I stopped counting after like 15 or something.
02:25:23.000 It's kind of amazing how many of the best...
02:25:25.000 I don't know.
02:25:26.000 I think there's some of the best and most interesting people that wind up on Substack and on this show.
02:25:33.000 It's a cool intersection.
02:25:35.000 Yeah, it is a cool intersection.
02:25:36.000 It's very cool.
02:25:37.000 I'm so happy that you guys exist.
02:25:39.000 I'm so happy that you guys give those folks a platform.
02:25:42.000 You know, whenever something like that comes up, I'm really excited because I say, ooh, good.
02:25:48.000 Something's emerged.
02:25:49.000 Because you wonder, like, when things clamp down.
02:25:52.000 Because there's a brief window where the vice can get tighter and tighter to the point where you can't squeeze anything out of it anymore.
02:26:00.000 And I worry.
02:26:01.000 I worry about centralized power in terms of like one entity that has the ability to disseminate information but decides what is good and what's bad information.
02:26:12.000 Because it just, it limits our understanding.
02:26:16.000 And our understanding is everything.
02:26:18.000 Our ability to communicate and understand how other people think and feel.
02:26:22.000 It's so critical to our own, our own version of what reality is.
02:26:30.000 Yeah.
02:26:30.000 And as soon as a movement or an intellectual idea or a school of thought loses the ability to hear its critics, to have critics and to hear criticism, as soon as you get to any idea, any religion, any school of thought or ideology,
02:26:45.000 no matter how good, if it loses its ability to be open to criticism...
02:26:50.000 It inevitably becomes evil, I think, because it loses its rudder.
02:26:56.000 There's nothing tying it to what's true or what's good, and those dynamics of everybody's vying for attention and power within the thing can take over unchecked.
02:27:09.000 These projects, even if you believe in them, you should welcome debate.
02:27:16.000 You should welcome criticism.
02:27:17.000 You should welcome sort of like a thriving marketplace.
02:27:20.000 If only so that your own ideas can become stronger and can win and not succumb to the trap of, like, sort of becoming stunted.
02:27:33.000 Where do you ultimately see Substack going?
02:27:36.000 Do you have some sort of a grand goal for Substack?
02:27:40.000 Do you have like a general direction that you see it going in?
02:27:45.000 The way that I think about this is I see it as we're creating a true alternative to the attention economy.
02:27:53.000 So you have this world of social media that's like, grab as much of your time in life as possible.
02:27:59.000 And certain things win there.
02:28:02.000 And people are spending more and more of their time there.
02:28:04.000 And some of that is good.
02:28:05.000 I think of it as eating junk food, maybe.
02:28:08.000 It's not the end of the world if you see cute videos of puppies on YouTube.
02:28:13.000 And that's fine.
02:28:14.000 Good things can be good.
02:28:16.000 But to your point about discipline becoming...
02:28:20.000 The important part of how you decide who you are.
02:28:24.000 You want to have an alternative to that, right?
02:28:26.000 You want to have something, you know, I'm not going to force people not to use TikTok.
02:28:30.000 I think that would be bad.
02:28:31.000 But you want to have something that's like an alternative.
02:28:33.000 You want to have something that's an alternate way for me to spend my time and my life.
02:28:40.000 That I can choose, that's compelling enough, that's got enough exciting, interesting stuff there that I'm not, you know, it's not like the eat your vegetables only platform.
02:28:49.000 Right.
02:28:49.000 But if I want to, like, take back control of my mind, of who I'm trusting, how I'm spending my attention, this is this place, this alternate universe on the internet with different laws of physics, where different kind of stuff wins, and where when you go there, it's not trying to grab as much of your life as possible.
02:29:08.000 You know, cynically, it's trying to grab as much of your money as possible.
02:29:11.000 But the way that it does that is by finding things that you actually value and making you make the choice as a better version of yourself.
02:29:19.000 I'm going to spend some of my life by subscribing to this person.
02:29:22.000 I'm going to spend some of my money supporting the creation of this piece of culture that matters to me.
02:29:29.000 And I think if we can, like, we sort of have that now, and it's this small thing.
02:29:33.000 There's like a million, million and a half subscribers, but it's like the energy of it is growing, and it's creating things that otherwise couldn't have existed.
02:29:42.000 It's letting people do work that they believe in that otherwise couldn't have existed.
02:29:46.000 I think that thing could actually get quite big.
02:29:49.000 I think it could get big to the point where it rivals or eclipses the size of the other things that are vying for our attention just because it's better.
02:29:57.000 Because the life that I'll lead if I take my mind back is more rewarding than the life that I'll lead if I spend all my time every day on TikTok.
02:30:08.000 I think people see this with their parents, where they get Facebook-brained, and you just look at it, and you're like, I don't want to be that.
02:30:16.000 If we can create this alternate universe... and it starts with writing.
02:30:19.000 I think writing is at the center of a lot of intellectual culture.
02:30:22.000 It's where a lot of ideas come from, where a lot of these things get hashed out.
02:30:25.000 We've been adding podcasting, we've been adding video, we're adding community features, we'll add some live stuff.
02:30:30.000 I think it's sort of like...
02:30:32.000 We want to let people have their own personal media empire and then have this exist in this network of people that are in conversation with each other, that control their own piece of it and that help each other out, that talk to each other, and that ends up funding a lot of great writing,
02:30:48.000 a lot of great thinking, a lot of great culture that otherwise could not have existed.
02:30:53.000 I also think there's something to the subscription-based model, where people are paying for people's stuff, and it's also optional.
02:30:59.000 Some people have their Substack open for free.
02:31:00.000 There's lots of free stuff.
02:31:01.000 Yeah, there's a lot of that.
02:31:03.000 It's open for free.
02:31:04.000 And some people have it where you have the option to pay, but it's available for free.
02:31:09.000 So if you choose to support, you're doing it just purely altruistically.
02:31:14.000 You're just deciding that this is something that I feel is beneficial, I want to support it, I want to help.
02:31:20.000 I think that you have skin in the game.
02:31:22.000 I think that there's something to that as well.
02:31:26.000 Like, this is something you pay for.
02:31:28.000 People will hate-read things, but they won't hate-pay for it.
02:31:30.000 We sometimes joke.
02:31:32.000 They might.
02:31:33.000 They sometimes do.
02:31:35.000 Occasionally you see it.
02:31:36.000 But then sometimes they hate-pay for it.
02:31:38.000 I said that to a writer the other day, and he's like, no, I had someone hate-pay for me.
02:31:42.000 He left a comment being like, I paid just so I could leave this comment, you asshole, blah, blah, blah, blah.
02:31:46.000 And he's like, but you know what?
02:31:48.000 Fine.
02:31:48.000 It's a year later, that guy's still reading.
02:31:51.000 There's something here that's worth sticking around for.
02:31:53.000 Well, that's the thing about a lot of the things that people consume is that even if they hate it, there's something about it that's compelling.
02:31:59.000 Maybe they're getting something.
02:32:00.000 Like I said, I get something out of reading hardcore right-wing people's perspectives on Roe v. Wade.
02:32:06.000 What am I getting out of that?
02:32:08.000 I don't know people like that.
02:32:10.000 I want to know how they think.
02:32:11.000 I want to know how they think about all sorts of different things.
02:32:16.000 I think it matters to have a place where you absolutely can know. Even if you don't know exactly what they think, you know what they're writing, what's coming out of their minds.
02:32:30.000 You get a better understanding and that's ultimately what we're all trying to do.
02:32:35.000 There's no fucking all-knowing human being.
02:32:38.000 We're trying to get a better understanding.
02:32:40.000 And the only way to do that is to allow people the unfettered, completely free ability to express themselves.
02:32:50.000 That's what you're doing.
02:32:51.000 Congratulations.
02:32:52.000 Cheers to you.
02:32:54.000 Thanks.
02:32:55.000 All right.
02:32:56.000 We can wrap it up with that.
02:32:57.000 I think we nailed it.
02:32:59.000 Awesome.
02:32:59.000 Thank you.
02:33:00.000 Thank you very much for being here.
02:33:01.000 If people want to shitpost on your social media, what is it?
02:33:05.000 First of all, go to Substack.com.
02:33:07.000 Start your own Substack.
02:33:09.000 Hit start reading, look at all the great things that are on there.
02:33:14.000 That's the most important thing.
02:33:15.000 I have a Twitter, but it's not very good.
02:33:16.000 Is there a site that curates or recommends some great substacks?
02:33:23.000 So if you go to substack.com, you can see some of the top substacks.
02:33:27.000 And there's a start reading where you can pick some categories and get some stuff that would be interesting to you.
02:33:31.000 If you find a couple things that are interesting, then you can start to look through the network.
02:33:36.000 If you find someone you like, you can see what are they recommending.
02:33:39.000 What are the things that are good for them?
02:33:40.000 So I would start either on Substack.com or go get the Substack app in the iPhone App Store, which exists now.
02:33:46.000 And the iPhone App Store, when you go to that, do you have an Android version as well?
02:33:53.000 There will soon be an Android version.
02:33:55.000 Oh, you don't have an Android version.
02:33:55.000 Very soon.
02:33:56.000 You don't like Android people.
02:33:58.000 I love Android people.
02:33:59.000 The cyborgs, they're great.
02:34:02.000 How come you haven't had an Android version yet?
02:34:05.000 Small team.
02:34:06.000 We built the iPhone one first.
02:34:07.000 Okay.
02:34:09.000 And the version of it, the application, does the app have the video on it for the podcasters and everything?
02:34:16.000 All that's on there?
02:34:17.000 Sure does.
02:34:17.000 No commercials?
02:34:18.000 No commercials.
02:34:20.000 That's beautiful.
02:34:21.000 It's pretty nice.
02:34:22.000 That's nice.
02:34:23.000 Okay, so your social media, one more time, yours?
02:34:26.000 Mine?
02:34:26.000 I think I'm cjgbest on Twitter.
02:34:29.000 You don't even know?
02:34:30.000 I love it.
02:34:31.000 I'm pretty sure.
02:34:33.000 cjgbest.
02:34:34.000 cjgbest, that's me.
02:34:35.000 Okay.
02:34:36.000 And we're at Substack Inc., yeah.
02:34:38.000 Thanks, Chris.
02:34:39.000 I really appreciate it.
02:34:39.000 Really great conversation.
02:34:41.000 Likewise.
02:34:41.000 Enjoyed it.
02:34:41.000 Thank you for doing what you're doing.
02:34:43.000 It really means a lot.
02:34:44.000 Thank you.
02:34:45.000 All right.
02:34:45.000 Bye, everybody.