The Joe Rogan Experience - February 19, 2019


Joe Rogan Experience #1248 - Bill Ottman


Episode Stats

Length

2 hours and 27 minutes

Words per Minute

167.5

Word Count

24,651

Sentence Count

2,240

Misogynist Sentences

34


Summary

In this episode, Joe talks with Bill Ottman, CEO and co-founder of Minds.com. They open with the hoaxer who impersonated Joey Diaz in messages to Bill, then get into the Grievance Studies hoax by Peter Boghossian, James Lindsay, and Helen Pluckrose, the earlier Sokal hoax, and the postmodernism generator that once fooled Bill in college. From there the conversation turns to surveillance and soft censorship on the big social networks, open-source and decentralized alternatives, chronological versus algorithmic feeds, Minds' token-based boost system, and whether censorship or dialogue does more to de-radicalize people.


Transcript

00:00:01.000 Five, four, three, two, one.
00:00:07.000 Legit.
00:00:08.000 Hello, Bill.
00:00:09.000 Hey, man.
00:00:10.000 What's going on?
00:00:11.000 You are here.
00:00:11.000 Here.
00:00:12.000 Yes.
00:00:13.000 You got a book of shit.
00:00:13.000 With a book.
00:00:14.000 I got a book.
00:00:15.000 You come prepared.
00:00:15.000 I mean, yeah, I'm trying to write.
00:00:17.000 I'm trying to get back into handwriting.
00:00:18.000 For people who don't know, Bill is the CEO and co-founder of Minds.com, and we've been going back and forth through email, and you got hoaxed by some dude who said he was Joey Diaz.
00:00:27.000 It did happen.
00:00:28.000 He really believed.
00:00:29.000 You're like, Joey's been on my network, and I'm like...
00:00:32.000 He was messaging me in Joey's voice, basically cloning it.
00:00:37.000 There's weird people out there, man.
00:00:38.000 Well, that's not hard to do.
00:00:41.000 You watch enough Joey.
00:00:42.000 Basically just cloning his tweets.
00:00:44.000 Cocksucker.
00:00:46.000 Every Monday morning or so, there's a tweet about someone needs to suck your dick.
00:00:54.000 They need to suck your dick.
00:00:55.000 You need to let them know.
00:00:56.000 That's on the regular.
00:00:59.000 What's the notes, man?
00:01:02.000 Just some ramblings from this morning.
00:01:04.000 Yeah?
00:01:05.000 Yeah.
00:01:06.000 Important stuff.
00:01:07.000 It's actually the first thing that I've written in this notebook.
00:01:10.000 I've not been doing handwriting much at all in the last years, probably.
00:01:15.000 Mostly digital.
00:01:16.000 Which is not good, because I actually majored in English.
00:01:20.000 Yeah, you definitely lose your ability to write words.
00:01:22.000 It's funny, I tried writing in, for whatever reason, I write mostly in all caps, because I mostly just write notes, but I tried writing with lowercase letters, and then I tried writing in cursive, and my cursive is like, it's almost like I have to relearn it.
00:01:37.000 Yeah, I was finding myself just trailing off at the end of certain words, but I blend it all together.
00:01:43.000 You what?
00:01:44.000 I blend it all together with capital and lowercase.
00:01:46.000 Oh, why do you do that?
00:01:47.000 I mean, well, just as a normal person would, proper grammar.
00:01:51.000 I thought you were just mixing them up randomly.
00:01:53.000 No, no.
00:01:54.000 So I did write my college thesis in all lowercase.
00:01:54.000 Okay.
00:01:58.000 Why?
00:01:59.000 Typed.
00:02:00.000 We protesting?
00:02:01.000 Yeah, kind of.
00:02:02.000 It was stupid.
00:02:03.000 It's like a cool move, right?
00:02:04.000 I'm not going to use any uppercase.
00:02:06.000 Who cares, man?
00:02:07.000 There's weird postmodern theory about capitalization.
00:02:12.000 Oh, really?
00:02:12.000 And that's kind of what I was talking about.
00:02:13.000 I got a little bit indoctrinated at UVM. Really?
00:02:16.000 Yeah.
00:02:18.000 This one class was called Critical Theory.
00:02:18.000 Yeah.
00:02:21.000 Which one is UVM? Vermont.
00:02:23.000 Oh, Vermont is like super social justice-y, right?
00:02:23.000 Vermont.
00:02:28.000 And paved with good intentions.
00:02:30.000 Yeah, yeah, yeah.
00:02:31.000 They have great ice cream up there, too.
00:02:33.000 Nice folks.
00:02:34.000 But this one class was called Critical Theory, and we had to watch Buffy the Vampire Slayer.
00:02:40.000 And apply, like, Marxist theory to it to show how, like, it's the rise up of the lower class.
00:02:48.000 It's like you're forced to write these papers in, like, a certain way.
00:02:52.000 Yeah.
00:02:52.000 Yeah?
00:02:53.000 What are they trying to prove?
00:02:56.000 Class division.
00:02:58.000 Class division?
00:02:59.000 Yeah.
00:02:59.000 In Buffy the Vampire Slayer?
00:03:01.000 There's, like, books and books written about Buffy the Vampire Slayer.
00:03:01.000 It's there.
00:03:05.000 What?
00:03:06.000 Marxism.
00:03:06.000 Yeah.
00:03:06.000 Come on.
00:03:07.000 Really?
00:03:07.000 Not kidding.
00:03:08.000 Yep.
00:03:09.000 What do they have to say?
00:03:12.000 I don't remember.
00:03:13.000 I don't even want to go into it.
00:03:15.000 It's amazing.
00:03:16.000 I mean, I'm sure you're aware of James Lindsay and Peter Boghossian and what is it?
00:03:22.000 The other woman's name?
00:03:23.000 Yes.
00:03:24.000 Shit.
00:03:24.000 Helen?
00:03:25.000 I didn't meet her, unfortunately.
00:03:27.000 The Grievance Studies hoaxes.
00:03:29.000 They submitted a bunch of fake studies to these journals and not only got reviewed, but got lauded and praised for their academic scholarship.
00:03:41.000 Yeah.
00:03:41.000 Yeah, I think that stemmed from this guy Sokal who first trolled a lot of the postmodern journals.
00:03:48.000 And he was the first.
00:03:49.000 And so it's called a Sokal hoax to do that kind of trolling.
00:03:53.000 It's hard to figure out who's who, right?
00:03:54.000 It's hard to figure out what's the hoax.
00:03:56.000 I got tricked by that.
00:03:57.000 There's this one thing called the postmodernism generator online.
00:04:00.000 It's a computer that writes articles.
00:04:03.000 That puts all of this fancy language together.
00:04:06.000 And someone sent it to me, and I showed it to my teacher, and I was like, oh, this is saying something pretty interesting.
00:04:12.000 But it was nothing.
00:04:15.000 What do you make of all this hoaxing?
00:04:18.000 You've been hoaxed twice then, that you've just admitted in the first minute of the show.
00:04:24.000 I mean, I think that you sort of have to have the right to...
00:04:27.000 To hoax?
00:04:28.000 To be wrong.
00:04:28.000 To hoax.
00:04:30.000 To mess up.
00:04:31.000 Well, that's not messing up.
00:04:32.000 That's deceiving people.
00:04:34.000 True.
00:04:35.000 But you kind of have the right to do that, too.
00:04:37.000 You kind of have the right to troll.
00:04:38.000 Yeah, you don't have the right to impersonate, but I have the right to get...
00:05:00.000 It doesn't make sense?
00:05:09.000 Like if someone is purposely...
00:05:11.000 Okay, let's say that you find some Chinese bot that's purposely disseminating incorrect and negative information about maybe a potential presidential candidate.
00:05:23.000 Let's pick one.
00:05:24.000 Tulsi Gabbard.
00:05:25.000 They're disseminating fake news about her.
00:05:28.000 You know for sure that it's fake.
00:05:30.000 You know for sure...
00:05:31.000 I don't know how you know, but you know for sure who the source of it is.
00:05:34.000 You don't think that should be taken down?
00:05:36.000 I think that if it's illegal, it should be taken down.
00:05:39.000 If it's illegal?
00:05:40.000 Yeah.
00:05:40.000 Okay.
00:05:41.000 But if it's just lies?
00:05:43.000 Then that could be illegal.
00:05:45.000 I'm not a lawyer.
00:05:47.000 So I'm trying to position the network or just like advocate for other networks to take more of a neutral stance.
00:05:55.000 There's this cool thing.
00:05:57.000 It's called the Manila Principles, which the Electronic Frontier Foundation wrote with a bunch of other internet freedom groups, which is talking about how digital intermediaries shouldn't be making these subjective decisions about what's getting taken down and should require a court order.
00:06:12.000 Now, with a DNS provider or something less content-focused...
00:06:19.000 Explain DNS to people who don't know what you're talking about.
00:06:21.000 A domain name, yeah.
00:06:24.000 Explain what that is.
00:06:25.000 It's like where you buy a domain.
00:06:28.000 But for a social network, we're hosting tons of content.
00:06:31.000 So it's harder for us because we see illegal content and we should proactively take some of it down if we know it's illegal.
00:06:38.000 But at the same time, it's just slippery.
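A minimal sketch of the distinction Bill is drawing, assuming Node.js: a DNS provider only maps a domain name to server addresses, so unlike a social network it never hosts or inspects the content served from those addresses.

```ts
// DNS lookup sketch (Node.js): resolve a domain name to IPv4 addresses.
// The resolver sees the name being looked up, not the content behind it.
import { promises as dns } from "node:dns";

async function lookup(domain: string): Promise<void> {
  const addresses = await dns.resolve4(domain); // A records only
  console.log(`${domain} ->`, addresses);
}

lookup("minds.com").catch(console.error);
```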
00:06:42.000 Well, I follow quite a few people who have...
00:06:44.000 I don't want to say they have hoax accounts, but they have parody accounts.
00:06:48.000 And, you know, a lot of people read into it wrong and think they're being honest and argue with them.
00:06:53.000 Like, there's this progressive dad guy on Instagram.
00:06:57.000 You ever follow him?
00:06:59.000 I know exactly what you're talking about.
00:07:00.000 He's hilarious.
00:07:00.000 The one I follow too, that's pretty funny.
00:07:02.000 I follow quite a few of them.
00:07:04.000 There was a guy who was the wrong skin guy who was saying he was born in the wrong color skin, that he's transracial, and he would make these ridiculous arguments about it.
00:07:13.000 And Elfwick, what was his name?
00:07:15.000 Apparently he's a comic from the show.
00:07:16.000 I was just thinking, this sort of started, I don't know the moment they had to do it, but let's say four years ago, a lot of those troll accounts had to sort of say, we're not...
00:07:46.000 That's funny, yeah.
00:07:47.000 Well, what's your stance on people who make accounts of your stuff, put it out there?
00:07:51.000 Well, I have a lot of them.
00:07:54.000 There's a ton of them.
00:07:55.000 Some of them actually do good stuff.
00:07:57.000 They make little clips, and they put those clips online, and it's good for people to enjoy the show.
00:08:02.000 They get a little one-minute snippet of things.
00:08:05.000 And then some of them will pretend to be me and contact people and try to book them on the show, which is really weird.
00:08:12.000 Yeah, I've had that.
00:08:15.000 But, you know, I mean, who is that?
00:08:16.000 Is that a 16-year-old kid in Indiana?
00:08:18.000 I mean, who is that?
00:08:20.000 It's odd.
00:08:21.000 But overall, it's like, I have to get past my own personal feelings because it's about me.
00:08:29.000 It's interesting.
00:08:31.000 This strange new ground that we're covering.
00:08:37.000 We've been discussing this ad nauseam on the podcast lately that essentially we've been dealing with 20 years of this.
00:08:43.000 And in those 20 years, it's changed radically.
00:08:47.000 What it is, it's become something completely different.
00:08:51.000 It's become something that changes public opinion on things overnight.
00:08:55.000 It's become something where you can distribute information from person to person about some huge international news event.
00:09:07.000 You can get all of your information from Twitter, whether it's what happened in Venezuela or anywhere there's something in the world.
00:09:14.000 People are turning to social media almost before they turn anywhere else.
00:09:19.000 When I hear about something, I almost always, before I even Google it, I almost always go to Twitter and check Twitter and see what's going on.
00:09:28.000 DuckDuckGo it.
00:09:29.000 DuckDuckGo?
00:09:30.000 Have you heard of that one?
00:09:31.000 No, what's that?
00:09:31.000 It's like a privacy-focused search engine.
00:09:34.000 It's pretty much the only privacy alternative to Google.
00:09:39.000 It's like this idea that we say, oh, just Google it.
00:09:41.000 Right.
00:09:44.000 Our whole process has been to like purge proprietary surveillance tools from our company.
00:09:51.000 And I've been trying to do it myself, like getting off Facebook, getting off Twitter, getting off Instagram.
00:09:57.000 It's just like they're so abusive to everybody.
00:10:00.000 And it's like there's brilliant people who work there.
00:10:03.000 I mean Instagram is such a well-designed app.
00:10:05.000 Are you kidding me?
00:10:06.000 Beautiful.
00:10:07.000 So what do you think is abusive about it, particularly?
00:10:11.000 Let's start with Twitter.
00:10:13.000 They're all the same.
00:10:14.000 Do you think they're all the same because they're all gigantic businesses?
00:10:20.000 Yeah.
00:10:21.000 And they're all the same because none of them share their source code.
00:10:24.000 And they all spy on everybody.
00:10:26.000 And they don't show you what is happening behind the scenes.
00:10:30.000 They don't show you what the code's doing.
00:10:31.000 So like in that note I wrote to you the other day, it's like, I compare it to like food transparency.
00:10:38.000 You know, 50 years ago, nobody thought about that.
00:10:42.000 And then 20 years ago, everyone's like, I want to know what's in my food.
00:10:46.000 But why wouldn't you want to know what's in your apps?
00:10:48.000 I mean, it's super sketchy what they're doing.
00:10:53.000 But how so?
00:10:54.000 What's super sketchy?
00:10:55.000 We don't know.
00:10:55.000 But we know that they're spying on everyone and tracking you everywhere you go.
00:10:58.000 They're targeting things at you based on physical location, browser, history.
00:11:03.000 Even when you're not on those websites, they're following you around where you're going on the internet.
00:11:06.000 Right.
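For illustration only, a toy version of the cross-site tracking mechanism being described; the endpoint and port are made up. Any page that embeds a request to this server leaks the visitor's referring page, cookie, IP, and browser, which is how "they're following you around" works mechanically.

```ts
// Toy tracking endpoint (Node.js): every request to the embedded "pixel"
// logs who the visitor is and which page they were on.
import { createServer } from "node:http";

createServer((req, res) => {
  console.log({
    page: req.headers.referer,        // the site the user was browsing
    cookie: req.headers.cookie,       // ties requests to a single identity
    ip: req.socket.remoteAddress,
    agent: req.headers["user-agent"],
  });
  res.writeHead(204); // respond with nothing; the logging was the point
  res.end();
}).listen(8080);
```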
00:11:07.000 And so some people accept that for this free search engine with free email and things along those lines.
00:11:14.000 They accept the fact that a certain percentage of what they're doing is not going to be private.
00:11:19.000 Or at least...
00:11:21.000 Their searches are not going to be private.
00:11:23.000 Like, say if you search, like, you're thinking about buying a Jeep, and you search Jeeps, you look at, you know, 2019 Jeep, and then all of a sudden all your Google ads are about Jeeps.
00:11:31.000 Right.
00:11:32.000 They're like, we know.
00:11:33.000 We know you're thinking about a Jeep, Bill.
00:11:35.000 And I don't think that that makes people want to spend more time on Google and Facebook.
00:11:40.000 What do you think it does?
00:11:41.000 Do you think it freaks them out?
00:11:42.000 I think that we're just numb to it, and so we accept it.
00:11:46.000 I think it's more than that.
00:11:46.000 Yeah.
00:11:47.000 Yeah, and so there's all different layers of like what we use with your browsers, your apps, your operating system, your food, your, you know, government, your energy,
00:12:04.000 like all of this technology.
00:12:07.000 It has code that's associated with it.
00:12:11.000 And when you open up your computer, when you sign into a browser, when you open up an app, you are empowering that app.
00:12:18.000 That's how the apps of the world become huge, monstrous corporations, is because we all use them every day.
00:12:23.000 So if you switch from macOS to GNU/Linux, like Debian or Ubuntu, if you use Brave or Firefox... DuckDuckGo is actually proprietary, which is annoying, but they are very privacy-focused.
00:12:40.000 And then there's apps.
00:12:41.000 There's Minds.
00:12:43.000 There's other open-source, decentralized social networks out there that we can potentially federate with.
00:12:49.000 There's really cool, new, interesting protocols like DAT and IPFS that are more torrent-style back-end.
00:12:56.000 So there's actually no servers
00:12:59.000 sitting in a giant warehouse like Facebook and Google have. It's fully peer-to-peer.
00:13:05.000 And we're trying to balance it because it's not like decentralization equals good and centralization equals bad.
00:13:11.000 But in order to get a sweet app like Instagram-style, you need servers to process video.
00:13:18.000 And so the tech is still sort of immature in the fully peer-to-peer Bitcoin-style environment.
00:13:24.000 But we're definitely getting there.
00:13:26.000 And I just think it's important for people to use things that are transparent to them and respecting our freedom.
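A sketch of the core idea behind torrent-style protocols like IPFS and DAT, simplified to one function: content is addressed by a hash of its own bytes, so any peer can serve it and anyone can verify it, with no central server as the source of truth.

```ts
// Content addressing sketch: same bytes -> same address, on any peer.
import { createHash } from "node:crypto";

function contentAddress(bytes: string): string {
  return createHash("sha256").update(bytes).digest("hex");
}

const post = "hello from a peer-to-peer network";
console.log(contentAddress(post)); // identical on every machine that runs it
```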
00:13:34.000 Yeah, I think one of the problems with these giant companies is that once they become big, you kind of use them as a default, and it's very difficult to get people to communicate with you off of them.
00:13:43.000 It's hard to say, hey man, I'm launching this new social media app.
00:13:48.000 I would imagine you could speak to this.
00:13:50.000 I'm launching this new social media app, and I want you to join it.
00:13:54.000 People are like, but I'm already on fucking Facebook.
00:13:56.000 I'm already on Google.
00:13:57.000 I'm already on Instagram.
00:13:58.000 I don't want to do that, man.
00:14:00.000 Too much.
00:14:01.000 Too much extra.
00:14:02.000 And we make it a million times harder for ourselves because we're not scooping into people's contacts.
00:14:07.000 And, you know, taking all their information.
00:14:09.000 That you're not.
00:14:10.000 We're not.
00:14:10.000 No.
00:14:11.000 Okay.
00:14:11.000 So, like, when you give your address book to an app.
00:14:15.000 Who does that?
00:14:16.000 Most apps.
00:14:17.000 You've got to be an asshole.
00:14:18.000 No, but when you say, oh, I want to find my friends who are on this app, and you share your contacts.
00:14:23.000 Well, you're not supposed to do that.
00:14:24.000 You're not supposed to do that, yeah.
00:14:25.000 But most people do.
00:14:26.000 And, you know, your friends didn't give you permission to give Facebook their phone number.
00:14:30.000 Do you do that?
00:14:32.000 I probably used to, like, seven, eight years ago, but I don't do it anymore.
00:14:36.000 I always say the same thing when it pops up.
00:14:38.000 Get the fuck out of here.
00:14:39.000 That's always what I say.
00:14:40.000 Would you like to share your contacts?
00:14:42.000 Get the fuck out of here.
00:14:44.000 No, you can't have my contacts, you asshole.
00:14:47.000 I know what you're doing.
00:14:48.000 Facebook is a weird one, man.
00:14:50.000 It's such a sneaky one.
00:14:53.000 Facebook and all the congressional hearings and the inner workings of it all.
00:15:01.000 The fact that it profits off of outrage, so it wants people to argue.
00:15:09.000 The AI, the computer learning, specifically wants people to have contentious debates about things because that keeps their eyes focused on the website.
00:15:20.000 And if your eyes are focused on Facebook, then those Facebook ads are very valuable.
00:15:26.000 It's really fascinating, man.
00:15:28.000 I think the outrage is unavoidable on any network.
00:15:32.000 It's more, you know, are you going to take down?
00:15:35.000 They're taking down outrage.
00:15:38.000 Some, yeah, sure.
00:15:39.000 And it just seems so inconsistent and subjective how they're applying it.
00:15:44.000 I mean, even just yesterday, I think some journalists got banned from Facebook.
00:15:49.000 Yeah, you aware of the story?
00:15:50.000 Yeah.
00:15:50.000 I'm going to send this to you, Jamie, because it's a really crazy one.
00:15:54.000 Because they wanted her to show who her funding sources were, and I didn't even know that there was an area where you could show that.
00:16:02.000 So it's almost like they're making this up as they go along.
00:16:05.000 Yeah.
00:16:05.000 Yeah, Kyle Kulinski sent me this today.
00:16:07.000 I'm going to send this to you right now, Jamie.
00:16:09.000 Hang on one second.
00:16:16.000 Hold on.
00:16:17.000 I'm very quiet.
00:16:19.000 Unfortunately, this is an audio show.
00:16:20.000 This is live air.
00:16:21.000 Yes.
00:16:22.000 Not dead air.
00:16:23.000 There you go.
00:16:24.000 I just sent it to you.
00:16:24.000 Okay, buddy.
00:16:25.000 Okay.
00:16:26.000 Facebook suspended In The Now's page.
00:16:32.000 At the behest of CNN and the U.S. government-funded think tanks, it says we had almost 4 million subscribers, did not violate Facebook rules, were given no warning, and Facebook isn't responding to us.
00:16:45.000 So yeah, what actually started this off?
00:16:50.000 I mean, who knows?
00:16:52.000 They don't communicate with anyone.
00:16:54.000 They've been banning legit accounts for years.
00:16:58.000 You cannot even send a Minds.com link through Facebook Messenger right now.
00:17:02.000 It's blocked.
00:17:03.000 What?
00:17:03.000 Yeah.
00:17:04.000 What?
00:17:04.000 If you post in the news feed, it says, careful, this could be an unsecure website.
00:17:11.000 Oh.
00:17:14.000 I clicked on a link from TMZ yesterday and got the same thing on Twitter.
00:17:17.000 Twitter said this might be malicious, there's spam, there could be...
00:17:21.000 No, it's from a TMZ link.
00:17:21.000 From Minds?
00:17:23.000 It was clicking, like this story is on TMZ. Here, do you want to see the rest of the story?
00:17:27.000 So they're trying to keep you from going to TMZ? Yeah, I don't know why.
00:17:30.000 It was the first time I've ever seen that.
00:17:32.000 It's probably caught up in some algorithm.
00:17:34.000 I sent an actual written letter to Facebook about it.
00:17:38.000 Obviously, they don't get back.
00:17:39.000 There's no human activity.
00:17:41.000 You wrote it with a piece of paper?
00:17:41.000 A written letter?
00:17:42.000 No, no.
00:17:43.000 That would have been cool.
00:17:44.000 I signed it with ink.
00:17:45.000 Really?
00:17:45.000 Yeah, no.
00:17:46.000 Because our lawyer said that actually proves that you sent them something, some sort of diligence.
00:17:54.000 But there's just no recourse.
00:17:57.000 Right.
00:17:59.000 They're lost.
00:18:00.000 So, explain.
00:18:02.000 So, if someone is trying to say on Facebook Messenger, hey, you should go check out Minds.com, it won't let you post that link?
00:18:09.000 Nope.
00:18:10.000 And what is their excuse?
00:18:12.000 No, they don't tell you.
00:18:12.000 They don't tell you?
00:18:13.000 So, is it because you're a competing social media network?
00:18:15.000 I don't know.
00:18:16.000 I don't want to get into...
00:18:18.000 I don't know.
00:18:19.000 You don't know.
00:18:20.000 I'm not going to say that.
00:18:21.000 But you just know that it does.
00:18:22.000 Yeah.
00:18:22.000 Yeah.
00:18:23.000 You don't know why it does, but you know it does.
00:18:25.000 And they're calling us unsecure, and I'm pretty sure that Facebook got hacked.
00:18:29.000 You know, they compromise everybody's data.
00:18:31.000 Like, you want to talk about unsecure, there's no more unsecure site that exists.
00:18:39.000 It is kind of funny, right?
00:18:41.000 I mean, after those hearings and after all the Russia stuff...
00:18:47.000 Yeah, it is kind of funny calling somebody else insecure.
00:18:50.000 Yeah, they're insecure.
00:18:52.000 Mark Zuckerberg is very insecure.
00:18:53.000 Well, he's also stupid rich.
00:18:55.000 He seems like he's too rich, like he fucked up.
00:18:58.000 Like he's there sipping water like a robot, trying to figure out what the fuck he's doing with his life.
00:19:02.000 I think that they're scared because they know they've betrayed everybody, and so it's hard to get them to speak.
00:19:10.000 You know, it's interesting with Dorsey here, because I give him credit for speaking, but the fact is that he's not answering the questions.
00:19:21.000 Well, he's bringing somebody else in to answer the questions in the next go-round.
00:19:24.000 And so that should be very interesting.
00:19:26.000 And you think he actually didn't know the answer to those questions?
00:19:27.000 I think he probably doesn't know all the specifics because he's a CEO of not one but two different corporations.
00:19:34.000 He's busy as shit.
00:19:35.000 And also rich as fuck.
00:19:38.000 True, but I think that when we look at the policy that exists on these networks, he is in control of the policy to a large degree.
00:19:47.000 There's a board, there's a decision-making process, but he has a large voice.
00:19:51.000 Okay, I don't know how large his voice is.
00:19:53.000 I assume that's probably true, but one of the things we did detail on the last podcast with Tim Pool was how he wasn't the CEO for quite a long time.
00:20:00.000 Yeah, he got fired and then rehired at some point.
00:20:02.000 Yeah, so obviously there's some contention, there's some issues, and there's a lot of money involved in these things, and I think that plays a giant part in how they decide to make decisions.
00:20:02.000 Yeah.
00:20:13.000 But do you think that an advertiser, in reality, doesn't, like, say you're an advertiser and you want to advertise your computer.
00:20:23.000 Okay.
00:20:23.000 And there's a video on YouTube that is about something controversial.
00:20:28.000 Does it actually make sense for that advertiser to not show their product on that controversial video?
00:20:35.000 Don't they want to sell computers?
00:20:37.000 Well, it depends.
00:20:39.000 I mean, if the controversial videos are about how Jews are evil, and you have this video about Jews being evil, and then you're like, buy Razer computers!
00:20:47.000 Come on!
00:20:48.000 Right, but do you think that people actually...
00:20:51.000 I can understand not wanting to support certain types of content.
00:20:55.000 And maybe advertisers feel like they're supporting that content by advertising next to it.
00:21:01.000 But I also don't think that people, when they're watching a controversial video on the internet, say, oh my gosh, you know, this advertiser is completely out of line for being next to this controversial thing.
00:21:14.000 I don't think that's a healthy direction to move.
00:21:17.000 Well, okay.
00:21:17.000 That's one way to look at it.
00:21:18.000 Another way to look at it is if you are a giant company that sells things.
00:21:23.000 Let's say you're Toyota and you're selling Tundras.
00:21:25.000 You don't want your Tundras to be associated in any way with something that you might think is negative.
00:21:30.000 It's their prerogative.
00:21:31.000 They're paying for advertising.
00:21:33.000 They can kind of decide.
00:21:34.000 This is one of the things that's leading YouTube in specific.
00:21:39.000 And I've had...
00:21:41.000 I've had a ton of conversations about this.
00:21:43.000 It's leading them specifically to try to demonetize things that could be considered distasteful or insensitive or controversial.
00:21:53.000 And it's very frustrating to content creators.
00:21:56.000 When you talk to them, they're essentially saying that they need to do better and that their tools are very blunt.
00:22:03.000 That they don't really have the correct computer learning tools to figure out what is offensive and why.
00:22:10.000 And then there's a human review system, which is very weird.
00:22:12.000 And we've run into that many times.
00:22:15.000 We'll have a podcast with, say, Tom Papa, who's an uncontroversial, fantastic stand-up comedian, and it's demonetized.
00:22:22.000 And then we're like, why?
00:22:24.000 What happened?
00:22:24.000 And then we go, what the fuck can we talk about?
00:22:26.000 We didn't talk about anything crazy.
00:22:27.000 And it's really damaging for brands when it gets demonetized right away because it's that initial time period that generates the most revenue.
00:22:34.000 So when you have to go back and do it...
00:22:36.000 I mean, so I agree with that.
00:22:40.000 So we built a tool that's like a peer-to-peer advertising tool.
00:22:43.000 So there's two options.
00:22:46.000 So you earn...
00:22:48.000 Crypto for your contributions.
00:22:50.000 And then...
00:22:50.000 Which cryptos do you support?
00:22:52.000 We have an Ethereum-based token.
00:22:54.000 But we're going to support all of them.
00:22:56.000 So what is an Ethereum-based token?
00:22:58.000 So it's an ERC-20 token.
00:22:59.000 What does that mean?
00:23:00.000 It means that we basically reward people for all of their activities.
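For readers wondering what ERC-20 itself specifies, since the answer moves on: it is a standard token interface on Ethereum, shown here as a TypeScript interface purely for illustration. Any contract implementing these six functions is an ERC-20 token.

```ts
// The ERC-20 standard, expressed as a TypeScript interface for illustration.
// Real implementations are Solidity contracts deployed on Ethereum.
interface ERC20 {
  totalSupply(): Promise<bigint>;
  balanceOf(owner: string): Promise<bigint>;
  transfer(to: string, amount: bigint): Promise<boolean>;
  transferFrom(from: string, to: string, amount: bigint): Promise<boolean>;
  approve(spender: string, amount: bigint): Promise<boolean>;
  allowance(owner: string, spender: string): Promise<bigint>;
}
```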
00:23:04.000 Okay.
00:23:05.000 So like, say, if Jamie's posted on Minds and people love his posts, he gets rewarded in some...
00:23:11.000 Yeah.
00:23:11.000 How much?
00:23:12.000 How much you get?
00:23:13.000 Well...
00:23:14.000 Can I go buy a house?
00:23:15.000 One token will give you a thousand impressions.
00:23:17.000 Oh.
00:23:18.000 So we're not focused on like, oh, you're going to make money from this.
00:23:21.000 That's not what we're saying.
00:23:22.000 One token will give you a thousand impressions, and you get a token from a thousand impressions.
00:23:30.000 When you use a token to advertise on Minds, you get a thousand impressions when you boost your posts with it.
00:23:36.000 So wait a minute, if you use the crypto, you use a token, you guarantee views?
00:23:42.000 Yeah.
00:23:43.000 That's weird, isn't it?
00:23:44.000 Why?
00:23:46.000 Well, you're guaranteeing people see something?
00:23:49.000 Well, we...
00:23:51.000 When you boost it, it gets fed to people's news feed chronologically.
00:23:54.000 Right, I see.
00:23:55.000 So, there's just a backlog.
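A hedged sketch of the exchange rate just described, not Minds' actual code: impressions earned convert to tokens, and spending a token on a boost buys a thousand guaranteed impressions from the queue.

```ts
// Illustrative reward/boost ledger: 1,000 impressions <-> 1 token.
const IMPRESSIONS_PER_TOKEN = 1_000;

class RewardLedger {
  private balances = new Map<string, number>();

  recordImpressions(user: string, impressions: number): void {
    const earned = impressions / IMPRESSIONS_PER_TOKEN;
    this.balances.set(user, (this.balances.get(user) ?? 0) + earned);
  }

  boost(user: string, tokens: number): number {
    const balance = this.balances.get(user) ?? 0;
    if (tokens > balance) throw new Error("insufficient tokens");
    this.balances.set(user, balance - tokens);
    return tokens * IMPRESSIONS_PER_TOKEN; // impressions guaranteed by the boost
  }
}

const ledger = new RewardLedger();
ledger.recordImpressions("jamie", 2_000); // earns 2 tokens
console.log(ledger.boost("jamie", 1));    // 1000 impressions queued
```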
00:23:57.000 So, sort of like when Instagram has those sponsored posts.
00:24:00.000 Except we're not spying on people when we send them.
00:24:03.000 Instagram spies on people, too?
00:24:04.000 Oh, my...
00:24:05.000 I don't know, man.
00:24:06.000 I'm stupid.
00:24:07.000 Help me out.
00:24:08.000 Yes, they do.
00:24:08.000 It's...
00:24:11.000 And the thing is, we just don't know.
00:24:13.000 So this is where free and open source software is just essential.
00:24:17.000 Like, the big networks, there's no excuse for them not to be sharing their software.
00:24:24.000 Right.
00:24:24.000 It's like, when you're a public forum on that scale...
00:24:28.000 The community just has a right to know what the algorithms are doing.
00:24:31.000 So you think that they're not sharing their software because their software is encoded and designed to spy on you and extract information and sell that information?
00:24:41.000 Partially.
00:24:42.000 Like when Jamie gives up your contacts.
00:24:44.000 When he signs up for an app and he says, yes, you can get access to all my contacts.
00:24:49.000 There's a lot of reason and they don't want people to compete with them.
00:24:51.000 Like anyone could actually take all of our code and make their own social network and compete with us.
00:24:56.000 They could set up on their own servers.
00:24:58.000 And we encourage that.
00:24:59.000 That's what's called the Fediverse.
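A very rough sketch of the federation idea, ActivityPub-style and heavily simplified; real servers also sign and verify these requests. Each server pushes a new post to the inboxes of followers hosted on other servers, so no single company owns the whole network.

```ts
// Simplified federated delivery: push an activity to remote follower inboxes.
interface Activity {
  type: "Create";
  actor: string;                 // e.g. "https://example.social/users/bill" (made up)
  object: { content: string };
}

async function federate(activity: Activity, inboxes: string[]): Promise<void> {
  await Promise.all(
    inboxes.map((inbox) =>
      fetch(inbox, {
        method: "POST",
        headers: { "Content-Type": "application/activity+json" },
        body: JSON.stringify(activity),
      })
    )
  );
}
```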
00:25:02.000 That's what Elon Musk does with Tesla.
00:25:04.000 All of his electric patents for electric cars.
00:25:07.000 I think that he opened up the patents.
00:25:08.000 I don't think he open sourced all the code of the car.
00:25:13.000 But he's definitely moving in the right direction.
00:25:16.000 He wants to build the market.
00:25:18.000 Yes, and he also wants to save the world.
00:25:20.000 I mean, he legitimately has this, and he also has a shitload of money.
00:25:23.000 He's got enough money.
00:25:25.000 I think that's a big factor with those guys.
00:25:28.000 But don't you think that it's almost like...
00:25:32.000 It's going to help.
00:25:33.000 Whatever network does that, is more transparent, stops spying on people, is more community-run and evolved.
00:25:43.000 Wouldn't that be the network that you would think humanity would want to stick with in the long term?
00:25:48.000 Wouldn't that be a good move of that?
00:25:50.000 Yes and no.
00:25:51.000 For the average person, what are they losing when they get on Facebook or Google?
00:25:58.000 What's bad?
00:25:59.000 Well now their likes are going down.
00:26:01.000 Everybody's likes are going down and that makes everyone very sad.
00:26:04.000 What do you mean?
00:26:05.000 Well the algorithms, you're only reaching 5% of your own followers organically on Facebook now.
00:26:11.000 And they're starting to change the chronological feed on Instagram too.
00:26:15.000 And they know that this causes depression and they're still doing it because they know that they think they're better at showing you what you want to see than you are.
00:26:27.000 And they want to make money from it.
00:26:29.000 What do you mean by they know that this causes depression?
00:26:31.000 They've done studies about mental health in relation to...
00:26:36.000 Actually, Facebook got exposed like five years ago for doing a secret study on...
00:26:41.000 On like a few million users where they were injecting both positive and negative content into the newsfeed and they proved that they could affect people's moods.
00:26:50.000 This was with Princeton.
00:26:51.000 There's a huge backlash and they're like, oh sorry.
00:26:54.000 Whoops.
00:26:56.000 Right, but this isn't injecting negative or positive content.
00:27:00.000 This is just moving these images or these posts around so that less people see them?
00:27:06.000 There's two different topics there.
00:27:08.000 The basic newsfeed on Facebook is now a mysterious conglomeration of thousands of variables, which we don't know.
00:27:17.000 But additionally, like a few years ago, they were exposed for having been experimenting with people's brains.
00:27:25.000 That's right.
00:27:26.000 I remember that now.
00:27:27.000 I remember that now.
00:27:28.000 That's right.
00:27:29.000 Yeah, I remember thinking, like, wow, that's kind of creepy.
00:27:33.000 They're experimenting on the people that are on their site, and they're not telling these people they're experimenting on them.
00:27:39.000 Yeah.
00:27:40.000 But, I mean, if they're trying to make it better...
00:27:44.000 Do you think that's a factor?
00:27:49.000 How does it cause depression if your images or your posts are not being seen by as many people?
00:27:57.000 Have you talked to kids posting on social media and their reactions to how many likes they're getting?
00:28:06.000 They get very, very concerned.
00:28:11.000 Well, that seems like more of a problem with that.
00:28:14.000 It is on both sides.
00:28:17.000 Being addicted to likes as some sort of a...
00:28:20.000 It's a weird dopamine hit, right?
00:28:23.000 It's not healthy.
00:28:24.000 We need to learn to not care about that.
00:28:27.000 But...
00:28:28.000 I think that the core purpose of a social network is to subscribe to someone and see their stuff.
00:28:33.000 And when people subscribe to you, they see your stuff.
00:28:36.000 So when you spend years building up a following on social media, and say you earn 100,000 followers or something, and then suddenly the network says, nah, your friends can't see that anymore.
00:28:51.000 That's not cool.
00:28:53.000 And even Twitter's default newsfeed is no longer chronological.
00:28:57.000 You have to click it to go chronological, and then it defaults back to their weird algorithm thing.
00:29:03.000 So we're saying, look, 100% organic, chronological, raw, forever as default.
00:29:08.000 And then if you want to curate algorithms or have recommended stuff come in as an alternative, fine.
00:29:14.000 But that is the core purpose of social media, is to connect with people that follow you and the other way around.
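The two feed policies being contrasted here, as a sketch with made-up types: the chronological default Bill is arguing for versus the engagement-ranked feed the big networks default to.

```ts
// Chronological vs. engagement-ranked feeds (illustrative).
interface Post {
  author: string;
  createdAt: number;        // unix timestamp
  engagementScore: number;  // whatever the platform's model predicts
}

// "100% organic, chronological, raw" as the default:
const chronologicalFeed = (posts: Post[]): Post[] =>
  [...posts].sort((a, b) => b.createdAt - a.createdAt);

// The opt-in (or, on the big networks, forced) alternative:
const algorithmicFeed = (posts: Post[]): Post[] =>
  [...posts].sort((a, b) => b.engagementScore - a.engagementScore);
```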
00:29:20.000 What do you think the purpose is?
00:29:22.000 Why do you think Facebook would decide to have things not in chronological order and only be seen by 5% of your followers?
00:29:29.000 What would be the benefit in that for them?
00:29:32.000 Revenue.
00:29:32.000 Revenue, how so?
00:29:34.000 How does that generate revenue?
00:29:35.000 They just know that they can keep you on the app better.
00:29:40.000 If you get less likes?
00:29:42.000 No.
00:29:43.000 If your stuff is seen by less people?
00:29:44.000 It doesn't make sense.
00:29:46.000 That's a good point.
00:29:47.000 It sort of works both ways.
00:29:49.000 I think that they think they know the people that you're going to react to the most.
00:29:55.000 So as a consumer, when you're getting that content, you know, the algorithms are showing you what you typically like.
00:30:03.000 Have you noticed that?
00:30:04.000 I'm really not paying much attention, but I believe you.
00:30:08.000 So, yeah, for creators, it's hurting creators.
00:30:11.000 People who post are getting hurt.
00:30:14.000 People who are sitting there just scrolling, they're the ones who are really getting, you know, addicted.
00:30:21.000 More so with the algorithms.
00:30:24.000 So how are the people that are posting getting hurt?
00:30:25.000 They're getting hurt because their stuff is being seen by less people?
00:30:28.000 Yeah.
00:30:28.000 Because it's not chronological and it's not organic because it's curated.
00:30:33.000 But aren't they doing it because they think it's going to be a better experience, one that's more conducive to your likes?
00:30:39.000 That's what they say.
00:30:40.000 What do you think they're doing it for then?
00:30:41.000 They're doing it because they...
00:30:44.000 Have studied, through looking at the data, how to keep people on the app more.
00:30:49.000 Right, and that way is to give them...
00:30:51.000 Like, say if I Google or if I look at muscle cars on Instagram.
00:30:54.000 Now, if I go to my search, it's all muscle car stuff.
00:30:57.000 So that's what it is.
00:30:58.000 They say, oh, he likes that.
00:30:59.000 So we're just going to give him a lot of that.
00:31:01.000 And I think that's okay as an alternative feed.
00:31:05.000 Or to put that somewhere...
00:31:07.000 I just think the core feed always needs to stay pure.
00:31:11.000 Because otherwise you're just...
00:31:13.000 Down the slippery slope again.
00:31:15.000 I understand.
00:31:16.000 They're injecting things into your head that you didn't ask for.
00:31:20.000 Right, and they're doing it because they want to keep you around.
00:31:22.000 Yeah, that makes sense.
00:31:25.000 How many different companies are subscribing to that?
00:31:29.000 It seems like all the big ones we're saying are curating and moving things around and all the big ones have an algorithm that's designed to keep you on board, right?
00:31:39.000 And that's okay to pursue.
00:31:40.000 I think there's really cool things you can do with AI and machine learning and algorithms that...
00:31:45.000 Is really beneficial.
00:31:47.000 But just taking away people's reach when they have worked years and years to achieve it, that's not okay.
00:31:54.000 Do you think that this is this marriage between something that is this social media network that's designed to allow people to communicate with each other and then commerce, like this business, like how do we maximize this business?
00:32:07.000 How do we get more profit out of this business?
00:32:08.000 How do we get these people to engage more?
00:32:11.000 And then they start monkeying with the code and screwing with what you see and what you don't see.
00:32:16.000 You think that's what's happening?
00:32:17.000 Yeah.
00:32:18.000 But in the short term, it's probably working.
00:32:20.000 But in the long term, they're betraying everybody's trust.
00:32:23.000 It has to be more of a consent-based system.
00:32:25.000 So, you know, at least give people – well, it should be opt-out by default.
00:32:30.000 And fine, give me messages to opt-in so that you can show me certain things.
00:32:35.000 But this whole forcing people into surveillance, it just has to stop.
00:32:44.000 It's super scary.
00:32:46.000 How's it super scary to you?
00:32:48.000 It's just too much power.
00:32:51.000 Yeah.
00:32:51.000 It's too much power for something that's supposed to be silly, right?
00:32:54.000 Like, what was Facebook supposed to be?
00:32:56.000 It was supposed to be some silly thing that you just can communicate with friends.
00:32:59.000 It was, but from the beginning, none of these networks have ever really been about the people of the networks.
00:33:06.000 It's always been closed source since the inception.
00:33:10.000 But then look at open networks out there.
00:33:13.000 You have Wikipedia.
00:33:14.000 Totally open source, community run.
00:33:17.000 Granted, they have their issues with moderation, fine.
00:33:20.000 But it's a top 10 website in the world.
00:33:22.000 It's totally open source.
00:33:24.000 Creative Commons content, incredible human achievement.
00:33:28.000 Bitcoin, open source money.
00:33:31.000 WordPress even is an open source CMS system that is like powering 25% of the internet.
00:33:38.000 So why wouldn't that happen with social media?
00:33:40.000 It should.
00:33:41.000 I mean, this is where everyone's hanging out.
00:33:43.000 So we should all sort of collectively even own it.
00:33:46.000 We did an equity crowdfunding round.
00:33:48.000 So like thousands of members of our community actually own the site.
00:33:52.000 Now, how many people are on Mines?
00:33:54.000 We have like a million and a half registered, like quarter million active.
00:33:59.000 We're small.
00:34:00.000 But the weird thing is that even though we're a fraction of the size, smaller creators who come over actually get better reach on Minds than they do on Facebook and Twitter, because we have this reward and incentive system, sort of gamified, where you earn reach and you earn more of a voice for contributing.
00:34:21.000 So like you could have an account on Twitter for 10 years and post thousands and thousands of tweets and you never hit that viral nerve and you just never really get much exposure.
00:34:32.000 So we're trying to help people be heard.
00:34:35.000 And so you'll find a small creator who has no followers on other networks with thousands and thousands of followers on Minds.
00:34:42.000 And what do you think you would like to do with Minds in the future that you haven't been able to do yet?
00:34:49.000 Engineer the control out of ourselves so that we aren't even in a position to really take people's stuff down, or... What if someone posts your house and your information,
00:35:05.000 where your kids go to school?
00:35:07.000 I think that on the central servers, obviously, yes, we're always going to moderate.
00:35:14.000 And if it's legal, it can stay.
00:35:16.000 If it's illegal, it can't.
00:35:18.000 But a decentralized social network is definitely where we have to go.
00:35:24.000 Because, and yeah, okay, it's scary.
00:35:27.000 And you know, you've talked about this, like, things are getting more transparent.
00:35:31.000 This is sort of like the inevitable evolution of technology.
00:35:35.000 I mean, how many hours a day do you stream?
00:35:36.000 A couple?
00:35:38.000 You know, 25 years ago, would you have thought you'd be sharing, you know, 20% of your life live streaming to, you know, millions of people?
00:35:47.000 Like, our lives are becoming more transparent just inevitably.
00:35:51.000 It's just pulling us.
00:35:52.000 Yeah, I agree.
00:35:53.000 So, you know, Bitcoin, crypto, DAT, torrent-type architecture, that is just where we're going.
00:36:06.000 Because it's more resilient.
00:36:08.000 It's less censorship prone.
00:36:11.000 There's just benefits of it.
00:36:12.000 I think that we can balance it too.
00:36:14.000 Like maybe when you post, you have a decision.
00:36:16.000 Do you want to be able to delete this at any point?
00:36:19.000 Alright, fine.
00:36:20.000 Then you can post to the central server.
00:36:22.000 Do you want this to get unleashed?
00:36:23.000 Yeah, it's scary because, you know, there's scary stuff on the internet.
00:36:27.000 It's already like that.
00:36:29.000 But, you know, getting into censorship more, does censorship even solve the problem?
00:36:36.000 Or does it make it worse?
00:36:37.000 What problem?
00:36:38.000 The problem of crazy content, illegal content.
00:36:41.000 How could it make it worse?
00:36:42.000 Well, I mean, it seems like it can often amplify radicalization.
00:36:47.000 It definitely can, right?
00:36:48.000 And it definitely, when you censor people, it just makes them aware that there's a plot against them too, right?
00:36:48.000 Yeah.
00:36:56.000 A lot of conservatives on Twitter are finding that.
00:36:59.000 Sam Harris actually just sent me an article.
00:37:02.000 It was detailing the bias against conservatives on Twitter that they've actually done, you know, some real study of, and it's pretty demonstrable.
00:37:14.000 Demonstrable?
00:37:14.000 It affects both the left and the right.
00:37:16.000 Demonstrable?
00:37:17.000 Yeah, the way I'm saying it wrong.
00:37:19.000 But it affects the left and the right for sure.
00:37:22.000 That's what Kyle was saying.
00:37:23.000 I watched that video that he did.
00:37:24.000 It's anti-establishment that seems to be getting targeted.
00:37:28.000 And so, you know, Abby's been censored on Facebook.
00:37:31.000 Abby Martin.
00:37:33.000 And yeah, this person today.
00:37:35.000 I mean, most of the stuff coming out of RT is progressive, which is weird.
00:37:39.000 And who knows what kind of...
00:37:41.000 Games are getting played behind the scenes with the Russians.
00:37:43.000 I mean, who knows?
00:37:45.000 But the point is, they have a right to be there.
00:37:48.000 And I mean, look, this is not YouTube's fault.
00:37:51.000 But remember the YouTube shooter?
00:37:53.000 I mean, she thought she was getting censored on YouTube.
00:37:56.000 And she went and brought a gun to the YouTube headquarters.
00:38:00.000 Like, people get pissed when they get censored.
00:38:05.000 It affects you.
00:38:06.000 Right, but in her case, you're talking about a crazy person that wasn't really being censored.
00:38:09.000 Oh, of course, but there's crazy people out there.
00:38:10.000 Yeah.
00:38:11.000 No, she wasn't being censored.
00:38:12.000 Well, she was getting censored just like everybody else is getting soft-censored on these networks.
00:38:16.000 Well, she just thought she wasn't getting promoted the way she wanted to.
00:38:20.000 I don't think anybody was actively doing anything to her.
00:38:23.000 No, I'm not saying that.
00:38:24.000 Her stuff was terrible.
00:38:25.000 I'm saying that the soft censorship of the algorithms, people getting demonetized, this has an impact on psychology.
00:38:32.000 I see what you're saying.
00:38:32.000 Right.
00:38:34.000 I'm not saying they were deliberately targeting her.
00:38:37.000 It's horrible what happened.
00:38:41.000 So what you're saying is that these algorithms that they use in order to maximize their revenue and give people things that they like but actually takes away from things being posted chronologically, keeps certain things from being seen by as many people, so it keeps them from being as viral,
00:38:58.000 so it keeps the whole thing from being organic.
00:39:01.000 Yeah.
00:39:02.000 Makes sense.
00:39:02.000 Yeah.
00:39:03.000 Yeah, it gets to that point where we're realizing that all of these things, all these social media things, are really recent.
00:39:11.000 We've only had them for a few years, and we don't necessarily know what the rules should or shouldn't be.
00:39:17.000 So it's good.
00:39:18.000 I mean, it's one of the reasons why I wanted to have you on.
00:39:20.000 I wanted to find out where these upstarts, these new people that are coming into the game, like Minds, where you're coming into the game from and what your position is on what's wrong with the current state of affairs.
00:39:33.000 Yeah, and look, there is messed up stuff on social media.
00:39:39.000 We'll get pigeonholed into being like, oh, you support all of this crazy stuff.
00:39:44.000 First of all, most of the users online are artists, musicians, filmmakers, activists, journalists, just trying to get their content out there.
00:39:51.000 There's a very tiny minority of actually crazy content.
00:39:56.000 When you say crazy content, what do you mean?
00:39:58.000 Alt-right?
00:40:00.000 I'm not even going to make decisions on what is and isn't crazy.
00:40:04.000 That's not my place.
00:40:10.000 It's been proven that censorship is not the answer.
00:40:15.000 Look at the history of prohibition.
00:40:18.000 You have digital content, it's substances, it's anything.
00:40:24.000 People want the ability to make the decision for themselves.
00:40:28.000 They certainly do.
00:40:29.000 And then the argument on the other side is when people are distributing, and I'm going to use the big air quotes, hate speech.
00:40:37.000 That's when it gets slippery to me because who's to decide what's hate speech and what's not hate speech?
00:40:41.000 I mean, I've seen people make some ridiculous fucking statements about all sorts of people that are inaccurate.
00:40:46.000 And they do that in order to categorize them and pigeonhole them in an easily definable and dismissable characterization.
00:40:55.000 You know, you just decide, hey, that Bill Ottman guy, that guy's a this.
00:40:59.000 Oh, he's a radical that, and he believes in this, so fuck him.
00:41:03.000 And they're like, okay, fuck him, sweep more.
00:41:05.000 And then cancel culture comes in, like, we're going to cancel Bill Ottman.
00:41:08.000 We're not listening to him anymore.
00:41:10.000 You know, he lied to us about his source, or whatever the fuck you're doing.
00:41:13.000 Have you heard of Daryl Davis?
00:41:15.000 No, I have not.
00:41:16.000 Unless I forgot.
00:41:17.000 Daryl Davis is your boy.
00:41:18.000 He's my boy?
00:41:19.000 I haven't met him, but he's my boy.
00:41:22.000 I want him to be my boy.
00:41:23.000 So he is a black man who befriended hundreds of members of the KKK. And he got them all to leave.
00:41:31.000 He got them to leave the KKK? 200 members left.
00:41:34.000 Wow.
00:41:35.000 After he was like, yeah, I'm just going to talk to you.
00:41:39.000 Really?
00:41:40.000 Did you ever see the W. Kamau Bell's show when he visited with those white supremacists?
00:41:46.000 Not that specific one.
00:41:48.000 No, it's really good because he's such a nice guy.
00:41:51.000 He's so easy to get along with that they let the guard down around him.
00:41:57.000 You get to see these people kind of confused that they like this guy.
00:42:03.000 That's why I think initiating human contact via the social networks, that's really important.
00:42:11.000 But, to play devil's advocate, it's one of the worst ways for people to express themselves in a way where you consider other human beings' experiences and feelings and the way they're going to receive what you're saying because there's no social cues, you're not interacting with them, you're not looking at them in the eyes.
00:42:27.000 It's one of the weirder forms of communication between human beings and one that I would argue we have not really necessarily successfully navigated it yet.
00:42:38.000 I agree.
00:42:40.000 I was actually saying that I think we should use social media more to get people to get together in real life.
00:42:46.000 Do you know who Megan Phelps is?
00:42:49.000 No.
00:42:50.000 She was with the Westboro Baptist Church.
00:42:55.000 You know, the famous one that protests those soldiers' funerals and anything gay.
00:43:02.000 They're like ruthlessly, viciously fundamental Christians.
00:43:08.000 They do a lot of protesting at funerals and do a lot of stuff to try to get...
00:43:11.000 She was with them for the longest time and then got on Twitter.
00:43:17.000 And through communicating on Twitter, and when you meet her, you would never believe it in a million years that she was ever this fundamentalist and that she was ever some mean person sending hateful messages to people because their son was gay or whatever it was.
00:43:30.000 Now, she's completely cured of it.
00:43:33.000 She has no contact with the church anymore.
00:43:35.000 She's married.
00:43:36.000 She has a kid.
00:43:36.000 She's completely outside of it.
00:43:38.000 She does a podcast now and gives TED Talks and speaks about radicalization and about how she was kind of indoctrinated and grew up in this family.
00:43:46.000 And her grandfather, Fred Phelps, was this, you know... he's a fucking mean guy.
00:43:52.000 Like a really mean, he's the God Hates Fags guy.
00:43:55.000 You know, they would have those signs that they would hold up at soldiers' funerals.
00:44:00.000 I mean, it's like really inflammatory stuff.
00:44:02.000 But through Twitter, through her communicating with people on Twitter, specifically her now husband, like, he cured her, like, just with rational discourse and communication, and she was open to it.
00:44:14.000 Yeah, people will change.
00:44:15.000 Yeah, they will change, yeah.
00:44:16.000 And so that's why banning them, I mean, I saw in a recent podcast, you've been talking about redemption.
00:44:24.000 Yeah.
00:44:25.000 So, but I'm curious, do you think people, what is, how does that look like?
00:44:31.000 Well, look, in the case of, like, Megan Phelps, that's a real thing, right?
00:44:37.000 She really did change.
00:44:38.000 Another example is Christian Picciolini.
00:44:41.000 Do you know who he is?
00:44:42.000 He was a white supremacist, a KKK member, a guy who's been on Sam Harris' podcast and done some TED Talks, and who now speaks out against it, talks about how he was indoctrinated, how lost he was, and how he was brought into this ideology.
00:44:59.000 There's many people like that all over the world.
00:45:03.000 Maajid Nawaz, another perfect example.
00:45:06.000 He was an Islamist.
00:45:07.000 I mean, he was trying to form a caliphate, was literally thinking about radical Islamic terrorism as being some sort of a solution.
00:45:16.000 Now he's the opposite.
00:45:17.000 Now he's trying to get people to leave, and he's trying to get people to be more reasonable and secular.
00:45:23.000 Did you see what happened to him?
00:45:24.000 Yeah, he got punched in the street.
00:45:26.000 Yeah, some guy called him a fucking Paki, I guess, and punched him in the head and fucked his head up.
00:45:32.000 And he's got this giant cut on his head from a ring and his face is swollen up.
00:45:36.000 But apparently they have the guy on video and they think they're going to be able to arrest the guy.
00:45:41.000 I've had Maajid on the show.
00:45:42.000 He's a super nice guy.
00:45:43.000 The hard thing is that, yes, we see these transformations take place.
00:45:49.000 It makes us feel warm inside.
00:45:50.000 And yes, people can change.
00:45:52.000 But at the same time, should people have to go apologize to Twitter?
00:45:58.000 Oh, I'm sorry.
00:45:59.000 Can I come back?
00:46:01.000 Sometimes people are going to think completely differently than you, and you just have to deal with it.
00:46:06.000 And that should be okay.
00:46:07.000 We shouldn't force people to come in to our way of thinking in order to have discourse.
00:46:14.000 No, that's a good point.
00:46:14.000 That's a very good point.
00:46:16.000 And, like, who is to decide what this path to redemption is and whether or not you've completed it?
00:46:22.000 Right?
00:46:23.000 Who is to decide?
00:46:23.000 Like, maybe you are, like, a hyper-radical lefty, and maybe Jamie's points of view and yours are just never going to line up, so you're like, fuck him, he's banned for life, which a lot of people have been banned for life.
00:46:36.000 And when you look at some of the infractions they've been banned for, they're like, boy, I don't know about that one.
00:46:42.000 That doesn't really make sense.
00:46:43.000 Almost none of the high-profile banning cases make much sense.
00:46:47.000 No.
00:46:48.000 It's like a short-term solution that's creating a long-term problem.
00:46:53.000 Yeah.
00:46:53.000 That's really what it is.
00:46:56.000 So, I just think that we have to talk about it more.
00:47:01.000 I don't know.
00:47:02.000 It's like, why can't we just get everyone to talk about it?
00:47:05.000 Yeah.
00:47:06.000 Like, at the same time.
00:47:07.000 I mean, it's like, we're just wasting time here.
00:47:11.000 Well, sort of, but I also think we're figuring it out as we go along with a bunch of different competing ideologies.
00:47:17.000 You know, you have yours, which, like, you, dude, you look like a hacker on, like, House of Cards.
00:47:23.000 You look like a guy you call in to break into the mainframe server.
00:47:28.000 You're not that, honestly.
00:47:29.000 I believe you're not.
00:47:30.000 I hang out on GitLab and check out code, but I cannot code.
00:47:35.000 I cannot code.
00:47:36.000 Listen, man.
00:47:37.000 I'm not claiming to be a developer.
00:47:39.000 No, these people are another level.
00:47:42.000 It is incredible.
00:47:43.000 I understand.
00:47:44.000 Right.
00:47:45.000 Yeah.
00:47:45.000 I get it.
00:47:46.000 Well, that's like if someone says to me, like, you're an MMA fighter.
00:47:49.000 I'm like, I'm definitely not.
00:47:50.000 And they are on another fucking level.
00:47:53.000 I know a little martial arts, but just settle the fuck down.
00:47:57.000 Right?
00:47:57.000 Same kind of thing.
00:47:58.000 I think, though, that your ideology, your point of view and perspective, is going to be very different than maybe someone who's like a radical Marxist.
00:48:09.000 You know, shouldn't they be allowed to post on the site too?
00:48:11.000 Someone who's like an extreme socialist.
00:48:14.000 Someone like AOC. Yeah.
00:48:16.000 You know, someone who thinks that we should give money to people who are unwilling to work.
00:48:21.000 Someone who thinks that we should try to engineer society and tax the top X percent, you know, 70-something percent of their income.
00:48:29.000 There's a lot of those different people, and we have to figure out how to make it so that, well...
00:48:39.000 We have to figure out a way to make it so all the ideas can compete in the marketplace of ideas.
00:48:45.000 All these different ideas can compete, and we can find out which one is better.
00:48:52.000 You don't always find out which one is better, though, right?
00:48:54.000 You find out which one is most popular.
00:48:55.000 I mean, that's what happened with Hitler.
00:48:57.000 You don't really find out what's better.
00:48:58.000 You find out what's got more juice behind it.
00:49:02.000 It's just, it's too risky.
00:49:04.000 Even being in the position that I'm in, you know, I see these edge cases.
00:49:10.000 Like, we say, look, if it's legal, it can be there, but we still see edge cases where we have to make decisions.
00:49:15.000 Okay, what's like an edge case?
00:49:18.000 I mean...
00:49:21.000 Let's see.
00:49:22.000 I mean, there is...
00:49:25.000 I don't even want to go here, but I will.
00:49:27.000 Uh-oh.
00:49:27.000 There is a type of animation.
00:49:32.000 Uh-oh.
00:49:33.000 Porn anime?
00:49:34.000 Yeah.
00:49:34.000 Yeah.
00:49:35.000 That is very sketchy.
00:49:37.000 Super.
00:49:38.000 You know, like, child porn, animated child porn.
00:49:41.000 And we've taken the stance that, look, it could fall under obscenity laws, so we're not cool with it.
00:49:47.000 Yeah.
00:49:47.000 But, you know, that is a huge debate.
00:49:50.000 Right.
00:49:50.000 It has not been decided by the Supreme Court whether animated, you know, kids count as that, and people will do the weirdest stuff.
00:49:59.000 And I just don't want to be telling people what is and what is not art.
00:50:05.000 Right.
00:50:06.000 So, like, some of that Japanese stuff with tentacles, like, some of that stuff is just like, what is happening here?
00:50:14.000 Right.
00:50:15.000 I got, like, octopuses banging chicks in every hole, and they're choking on it, and they've got one in their ass and one in their vagina, and it's all, like, very liquidy.
00:50:26.000 You know, there's a lot of splattering going on.
00:50:28.000 You're like, what the fuck is this, and is that okay?
00:50:31.000 Because it's just art, right?
00:50:32.000 I mean, if it was a real person getting fucked left, right, and center by an octopus, you'd be like, yeah, I think we've crossed some lines here.
00:50:39.000 That's bestiality.
00:50:40.000 But if it's an image, and then the image is a girl with a schoolgirl costume on.
00:50:45.000 She's dressed like a Catholic schoolgirl with a little skirt, and she's getting banged by an octopus.
00:50:49.000 You're like, what do you do with that?
00:50:54.000 Right?
00:50:54.000 Yeah.
00:50:55.000 What would you do?
00:50:57.000 It's a good question.
00:50:58.000 I'm glad I don't have a social media site where I have to make that decision.
00:51:00.000 Well, the real concern would be, is this something that is actually illegal?
00:51:07.000 That's the thing.
00:51:08.000 And we've tried to look at the case law, and we've seen that this type of stuff has been called obscenity before.
00:51:16.000 And so we're just not going to risk it.
00:51:18.000 But I still, you know, in a...
00:51:22.000 Alright, nipples.
00:51:23.000 Nipples?
00:51:23.000 Look.
00:51:24.000 Right.
00:51:25.000 Did you know that Free the Nipple started out with- On 4chan.
00:51:29.000 Well, everywhere.
00:51:30.000 It's a whole movement.
00:51:32.000 To be honest, Time Magazine just did a really interesting piece about a statue that got banned from Facebook.
00:51:37.000 It was a naked ancient statue that has a nipple.
00:51:41.000 Like, I'm sorry, that's not realistic.
00:51:45.000 That's not helping society, taking down a naked statue.
00:51:50.000 Well, we were talking the other day about how, during the Super Bowl, Adam Levine had his shirt off, and Brian Redban was like, hey, wasn't that what Janet Jackson got in trouble for?
00:52:00.000 Like, yeah.
00:52:01.000 Why is it okay if Adam Levine shows his nipples, and Janet Jackson's nipples are offensive because they're sexualized, because she's a woman?
00:52:08.000 Mm-hmm.
00:52:10.000 This is the weird fact.
00:52:12.000 Men had to gain the right to have their nipples shown in public back in the day.
00:52:19.000 When's the day?
00:52:20.000 If you go on the Free the Nipples site, there's this...
00:52:25.000 Go on their Instagram or something.
00:52:27.000 I think that's maybe where I saw it back when I used Instagram.
00:52:29.000 But...
00:52:31.000 You know, society is evolving.
00:52:33.000 We're going to get there.
00:52:34.000 We're going to be able to handle it, I think.
00:52:36.000 Or give people the controls so that they can only see the types of things that they want to see.
00:52:40.000 That's ultimately what it's about.
00:52:42.000 So, like, you should have, like, a filter.
00:52:44.000 Like, do I want 18 plus?
00:52:45.000 Do I want PG-13?
00:52:48.000 Like, what kind of distinction do I want?
00:52:51.000 Yeah.
00:52:52.000 Yeah.
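To make the filter idea concrete, here is a toy sketch of a per-user maturity setting. The rating levels and field names are invented for illustration; this is not Minds' actual code.

```python
# Hypothetical per-user maturity filter: a viewer picks a maximum rating,
# and the feed drops anything above it.
from enum import IntEnum

class Rating(IntEnum):
    EVERYONE = 0   # safe for all ages
    PG13 = 1       # mild mature content
    ADULT = 2      # 18+

def visible_posts(posts, max_rating):
    """Keep only posts at or below the viewer's chosen rating."""
    return [p for p in posts if p["rating"] <= max_rating]

feed = [
    {"id": 1, "rating": Rating.EVERYONE},
    {"id": 2, "rating": Rating.ADULT},
]
print(visible_posts(feed, Rating.PG13))   # only post 1 survives the filter
```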
00:52:52.000 And then when things come up, like, one of the things that Instagram has been doing, I follow a lot of hunters, and Instagram has this thing where they say, warning, this is sensitive content.
00:53:04.000 Nature is Metal gets popped on that a lot too because Nature is Metal is an Instagram site that's all like these crazy images and videos of animals eating other animals and attacking other animals.
00:53:16.000 And sometimes, some of them, they just decide, this one's too fucked up.
00:53:20.000 They just decide.
00:53:21.000 There's one of them where a lion is looking out of a wildebeest asshole from the inside.
00:53:29.000 There's this giant hole they've eaten through its stomach, and it's looking out its asshole.
00:53:33.000 And they're like, yeah!
00:53:34.000 This one, you're going to have to click on your own.
00:53:36.000 You have to double click.
00:53:37.000 What do you got, Jamie?
00:53:38.000 Basically, from what I just looked up, Tarzan is the catalyst for why guys wanted to have their shirts off.
00:53:44.000 Like in the 1910s and 1920s, in pools, they had to wear a top.
00:53:50.000 But look, this only covers one nipple.
00:53:52.000 They're probably tired or sweaty.
00:53:54.000 That's the rebellion right there.
00:53:55.000 That's how they started doing it.
00:53:57.000 They're pulling down the strap.
00:53:57.000 Look what it says here.
00:53:58.000 Saucy lifeguards flash rebellious nipples.
00:54:01.000 Ha!
00:54:03.000 It got overturned in 1937. That's hilarious.
00:54:09.000 So it was Tarzan, 1937, New York State's male shirtless ban.
00:54:14.000 That's when they overturned it.
00:54:15.000 The incident attracted press attention as Atlantic City and other waterfronts similarly mandated against man-nips.
00:54:22.000 With that legal domino tipped, along with the help of Hollywood hunks.
00:54:28.000 And you were talking about how Twitter has porn.
00:54:31.000 Yes.
00:54:31.000 A lot.
00:54:32.000 Yo, we got banned from Google Play for that.
00:54:36.000 Twitter has it.
00:54:37.000 They're up on Google Play.
00:54:38.000 Yeah, Twitter has a substantial amount of porn.
00:54:42.000 You know, you follow, like, some of them gals, and they just want you to see, look, here's one in my pussy.
00:54:47.000 Right there.
00:54:48.000 Take a look.
00:54:49.000 Like, full-blown.
00:54:50.000 It's not offensive.
00:54:52.000 Well, it's not offensive if you follow them.
00:54:54.000 If you follow certain porn stars, you know what you're going to get.
00:54:56.000 I think it's against their own terms.
00:54:58.000 Oh, really?
00:54:58.000 But they're just allowing it because they know they want that traffic.
00:55:01.000 Oh, is that what it is?
00:55:02.000 You know they want that traffic.
00:55:03.000 Oh, you know they want that traffic, bro.
00:55:06.000 Yeah, it's a problem if you hand your phone to your kid.
00:55:10.000 You know, they accidentally click on that link and they're like, Mommy, what's happening with her?
00:55:15.000 But Jonathan Haidt. Is it pronounced Hate or Height?
00:55:18.000 Height?
00:55:19.000 Height.
00:55:19.000 He was talking about an interesting thing where, you know, should there be an age where we really get into social media?
00:55:29.000 I don't know.
00:55:30.000 I mean, people should be free to do what they want to do, but, you know, the internet is the wilderness, right?
00:55:38.000 Well, his book, The Coddling of the American Mind, I'm in that right now.
00:55:43.000 I just finished his other one and I'm working on that one.
00:55:46.000 And a lot of it has to do with social media and a lot of it has to do with the impact that it has on young people.
00:55:52.000 You know, people are not really designed for this.
00:55:55.000 And you might be able to handle it if you're a 32-year-old man or a 35-year-old woman or whatever you are.
00:55:59.000 But if you're a 15-year-old girl...
00:56:02.000 It might be overwhelming.
00:56:04.000 I mean, and the angst and the anxiety and, you know...
00:56:09.000 That's what I was saying about the depression.
00:56:11.000 You know, they see that they're not at a party, or their stuff's not getting liked.
00:56:15.000 That has an impact on them.
00:56:17.000 And ultimately, I think the networks need to be helping educate people...
00:56:22.000 How to, you know, whether it's disinfo, educate people how to research.
00:56:26.000 I did see that YouTube is starting to do, like, a you've been on this for too long type thing.
00:56:31.000 Really?
00:56:32.000 Yeah.
00:56:32.000 Like, get a life, you fuck.
00:56:34.000 Yeah.
00:56:35.000 They tell you that?
00:56:35.000 I want to build stuff like that.
00:56:37.000 That's really important.
00:56:38.000 Yeah.
00:56:39.000 Helping people get off.
00:56:41.000 Yeah.
00:56:42.000 I haven't seen that.
00:56:44.000 I haven't done enough time on YouTube where they're kicking me off.
00:56:48.000 I have.
00:56:49.000 Yeah.
00:56:50.000 It's easy.
00:56:51.000 You know, I sent Eddie Bravo this thing from The Guardian about the upsurge in people that believe in the flat earth, all of it because of YouTube videos, and apparently now YouTube wants to censor those.
00:57:07.000 They feel like Flat Earth videos, and, check this if I'm wrong about this, but I think they also want to lean on those anti-vaccination videos.
00:57:21.000 I think there's a concern with those.
00:57:24.000 I think they're worried about a bunch of different things along those lines.
00:57:28.000 They feel like there's disinformation and outright lies that are being spread.
00:57:32.000 How do we combat it?
00:57:33.000 We own this platform.
00:57:34.000 What do we do?
00:57:35.000 They feel like they have a responsibility.
00:57:36.000 I think there is responsibility.
00:57:38.000 Okay, but what is the responsibility if there's a debate?
00:57:41.000 I think it's more to educate people how to research as opposed to saying this is or is not true because who's deciding that?
00:57:49.000 Well, I believe the earth is round.
00:57:52.000 But, uh, I also believe it's such a stupid conspiracy that you should have it available.
00:58:00.000 You should be allowed and it should be something you should show your friends.
00:58:04.000 Like, dude, I need you to go look at this.
00:58:06.000 This has 37,000 thumbs up.
00:58:16.000 Yeah.
00:58:30.000 Of course.
00:58:31.000 But I think freedom of information sort of transcends a lot of these little debates.
00:58:37.000 So if there was more freedom of information, so we actually knew everything the government knew about all of the different conspiracies and black projects, the black budget.
00:58:47.000 Yeah.
00:58:49.000 More information is going to give both sides the ability to understand what is happening.
00:58:54.000 That's true.
00:58:55.000 The reality is that we don't know what's happening, and there is lots of secret stuff.
00:58:59.000 The problem with that, though, is then you're dealing with foreign governments that are way better at keeping secrets than we are, and then they have access to our secrets.
00:59:07.000 One of the things that's been kind of disturbing is seeing the actual influence that these Russian troll farms have had, not just on our political process, but on sowing seeds of dissent amongst people and starting conflict amongst people, and how people are buying into it.
00:59:25.000 You know, like this podcast I've been talking about a lot with Sam Harris and Renee DiResta, that's her name, right?
00:59:31.000 Where they talked about how these Russian troll farms set up a conflict by having a pro-Muslim rally across the street from a pro-Texas Pride rally.
00:59:43.000 And they just set it all up and had it there and then a skirmish broke out.
00:59:47.000 Because these people are across the street from each other.
00:59:49.000 And they do this with – they were running these African-American groups that were saying anyone but Hillary, and they were really trying to get people to vote for Jill Stein, really trying to get people to even consider Trump, anyone but Hillary.
01:00:03.000 And then they were also having ones that were against them.
01:00:06.000 They're trying to make debate.
01:00:09.000 They're trying to make anger.
01:00:11.000 I don't think you can stop that.
01:00:12.000 But it's a fascinating thing, isn't it, that this is like a concerted effort?
01:00:17.000 How do you feel about that?
01:00:19.000 When you're in a position where you have a fairly small network, but it's influential, right?
01:00:25.000 And then so you're watching Zuckerberg and the Facebook shit on TV, and they're talking to these congresspeople and senators, and they're talking to all these politicians about what's going on and how to stop it and what they're trying to do, and you feel like, oh God, this is an arena that I'm getting into.
01:00:42.000 What would you do?
01:00:44.000 I mean, I think more conversation needs to happen, not less.
01:00:49.000 Yeah, I think you're right.
01:00:50.000 I want more information from the government, from the corporations.
01:00:54.000 From the trolls?
01:00:55.000 From the trolls.
01:00:57.000 I mean, I feel like I have a pretty good ability to discern what is and is not troll behavior.
01:01:07.000 I think help people understand how...
01:01:11.000 How to absorb information.
01:01:14.000 Just banning an account that has an agenda? Everyone has an agenda.
01:01:22.000 It's a propaganda back and forth between everybody.
01:01:26.000 Just because somebody posts a Jill Stein meme, okay, what's your point?
01:01:37.000 I'm not saying that regime change behavior is...
01:01:44.000 Positive or negative.
01:01:45.000 I don't know how we sort of switched gears.
01:01:46.000 No, we did, but let me step in here.
01:01:49.000 When you're saying a Jill Stein meme, there's absolutely nothing wrong with you posting a Jill Stein meme.
01:01:55.000 Like, say, if you have a joke about Jill Stein, you wanted to post it in a meme.
01:01:58.000 There's nothing wrong with that.
01:01:59.000 What's weird for people is that people are being hired to make these memes, and these memes may not have anything to do with their own personal ideology.
01:02:07.000 They might just decide, hey, I'm going to collect this check And they make, apparently according to Renee in this podcast she did with Sam Harris, they make really hilarious memes.
01:02:17.000 Like some of them are really funny.
01:02:18.000 I listened to that podcast, yeah.
01:02:19.000 It was great, right?
01:02:20.000 She said that she started laughing a couple times.
01:02:22.000 Yeah, yeah, yeah.
01:02:22.000 And she had to go through thousands and thousands of them.
01:02:27.000 That's weird, right?
01:02:28.000 There's this idea of a web of trust, which is interesting.
01:02:31.000 Sort of like a peer-to-peer.
01:02:32.000 It's not like a Chinese social credit score.
01:02:35.000 But it's like, the people that you're connected with...
01:02:40.000 ...can show a certain account to be untrustworthy, because you trust your little network.
01:02:47.000 So it's sort of like a peer-to-peer score.
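The web-of-trust idea can be sketched as a weighted average: your trust in an unknown account is derived from the ratings your own trusted peers give it. The names, weights, and 0-to-1 scale below are invented for illustration; real systems (PGP's web of trust, for example) are more elaborate.

```python
# Toy peer-to-peer trust score: trust in an account is the weighted average
# of the ratings your trusted peers assign to it.
def trust_score(my_peers, peer_ratings, account):
    """my_peers: {peer: how much I trust that peer, 0..1}
    peer_ratings: {peer: {account: that peer's rating of the account, 0..1}}"""
    weighted, total_weight = 0.0, 0.0
    for peer, weight in my_peers.items():
        rating = peer_ratings.get(peer, {}).get(account)
        if rating is not None:
            weighted += weight * rating
            total_weight += weight
    return weighted / total_weight if total_weight else None  # None = no signal

peers = {"alice": 0.9, "bob": 0.5}
ratings = {"alice": {"@suspicious": 0.1}, "bob": {"@suspicious": 0.3}}
print(trust_score(peers, ratings, "@suspicious"))  # ~0.17, i.e. low trust
```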
01:02:50.000 We're looking at different ideas.
01:02:52.000 I think that transparency and understanding what's going on with different accounts and if it's the real person, that's all important stuff.
01:03:03.000 We don't want frauds.
01:03:05.000 We don't want disinfo.
01:03:09.000 We just have to really step back and think about how we're doing it rather than letting AI and algorithms run the show.
01:03:16.000 Right.
01:03:17.000 I see what you're saying.
01:03:18.000 Do you think that there's a, I don't want to say there's a market, is there a demand for this?
01:03:24.000 Like are a lot of people responding in a positive way to the way you guys are approaching the game?
01:03:29.000 Yeah, for sure.
01:03:30.000 Every time there's a big scandal, whether it's data manipulation or something else. Our first big growth spurt was during the Snowden days, when he released all the information.
01:03:44.000 People are really upset with what's happening.
01:03:48.000 It's just...
01:03:50.000 What are they supposed to do?
01:03:51.000 This is what they're using for their communication.
01:03:54.000 It's not easy to just achieve a multi-billion person network overnight so that everybody's there.
01:04:00.000 And so we're stuck.
01:04:02.000 But again, I think that supplementing, just installing these alternative apps, not just us, like the whole open source market.
01:04:10.000 I'm not even here trying to just talk about what we're doing.
01:04:14.000 It's like, if you don't have those apps on your phone and you don't use those browsers, I'm sorry, you just, you're not helping.
01:04:22.000 And people just want to vote with their energy, I think, and vote with their time.
01:04:27.000 So it's more of an education thing.
01:04:30.000 People just don't know that this matters and that this can help change the whole internet simply by logging into an app once in a while.
01:04:38.000 It's like organic food.
01:04:40.000 I mean, we want to put things into it.
01:04:43.000 We want to support things that have integrity.
01:04:46.000 So when you click something, you are supporting that thing.
01:04:51.000 When you're sitting on an app all day, you are feeding that app.
01:04:54.000 That's how the apps get all the money.
01:04:55.000 That's where they get all their funding.
01:04:58.000 That's where it's all based, is in user retention and energy.
01:05:04.000 And do you think that most people are even aware of this, or do you think they're just using it because it's convenient?
01:05:09.000 The biggest charade going on right now is that most people don't know that Facebook owns Instagram.
01:05:14.000 They think Instagram is cool because it's not Facebook.
01:05:19.000 Right, it's one giant umbrella.
01:05:23.000 What is the difference between the two?
01:05:26.000 Obviously, with Instagram, it's just pictures mostly and then whatever the post is below the pictures.
01:05:31.000 But with Facebook, it's a lot more commentary and long, verbose statements on shit and then people arguing in the comments about it.
01:05:43.000 Yeah.
01:05:43.000 All the Instagram founders left, abandoned ship.
01:05:47.000 The WhatsApp founders abandoned ship.
01:05:49.000 The Oculus founder abandoned ship.
01:05:51.000 All because of the privacy stuff.
01:05:53.000 Really?
01:05:53.000 They're like, you took this good thing.
01:05:55.000 Well, it was proprietary, so I would question whether it was ever actually a fully good thing.
01:06:00.000 But at least it wasn't completely corrupted by Facebook.
01:06:03.000 But all of the founders of those companies left because they hate what's going on.
01:06:07.000 The WhatsApp guy joined Signal, which is a really cool open source, end-to-end encrypted messaging app.
01:06:16.000 So, you know, these people know...
01:06:18.000 So what happened with WhatsApp?
01:06:19.000 It's not the same anymore?
01:06:21.000 WhatsApp is owned by Facebook.
01:06:24.000 I don't know this.
01:06:25.000 Oh, sorry.
01:06:26.000 Come on, bro.
01:06:27.000 Get in the jungle, man.
01:06:29.000 Well, you're deep into this, man.
01:06:31.000 That's why I want to talk to you about it.
01:06:34.000 Yeah, and they're all buying up companies and using these same sort of ideas.
01:06:39.000 Yeah, and now they're talking about integrating the messages between WhatsApp, Instagram, and Facebook, so it's all one system.
01:06:46.000 Ooh.
01:06:46.000 Yeah.
01:06:47.000 Yeah.
01:06:48.000 Centralization.
01:06:49.000 Yeah.
01:06:51.000 It has happened?
01:06:52.000 It hasn't been turned on yet, but I believe it's happening.
01:06:55.000 Like if you want to message your DMs on Instagram, you're going to have to download the Facebook Messenger.
01:06:59.000 So what are the challenges for something like Mines when you're trying to take off?
01:07:05.000 Like there was a social media Instagram type thing that was around a little while ago.
01:07:12.000 Remember, I used it like once and I posted about it. What was that called?
01:07:17.000 Vero?
01:07:18.000 Is that what it was?
01:07:19.000 But then a lot of people were saying it was bullshit.
01:07:22.000 They're proprietary.
01:07:24.000 It's closed source.
01:07:26.000 No idea of what's going on behind the scenes.
01:07:28.000 Same as the other ones.
01:07:28.000 A lot of apps try to say that they're alternatives and that they support X, Y, Z privacy or free speech or whatnot.
01:07:38.000 I don't think it's any new paradigm if they're not showing their source code so that people can see the algorithms.
01:07:44.000 The people who care, you know, obviously most people aren't going to go and inspect the code.
01:07:48.000 Right.
01:07:48.000 But just the principle that the experts could, because they will.
01:07:52.000 You know, there's all kinds of think tanks and whatnot that would love to dive into the source code to understand how these companies were actually behaving.
01:08:00.000 So, you know, waving the privacy flag without being open source or...
01:08:07.000 This is getting a little bit into the weeds, but a lot of this comes down to licensing of content or code.
01:08:13.000 So the license that we use for our code is the GNU Affero General Public License, the AGPLv3, which means that anyone can take our code and do whatever they want with it.
01:08:24.000 They can sell it.
01:08:25.000 They can do anything.
01:08:26.000 But if they make changes, they have to share them with everybody else.
01:08:30.000 So it's sort of like the Creative Commons share-alike license, which essentially says the same thing.
01:08:36.000 Take my video, photo, remix it, do whatever you want, but you have to share the result with everybody else.
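For reference, the share-alike obligation usually shows up as a notice at the top of each source file. The wording below paraphrases the standard AGPLv3 header (the canonical text is on gnu.org); the author name and year are placeholders.

```python
# Copyright (C) 2019 Example Author (placeholder)
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or (at
# your option) any later version.
#
# The "Affero" part is the network clause: if you modify this program and
# let users interact with it over a network, you must offer those users
# the modified source code as well.
```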
01:08:42.000 Open source basically means you can do whatever you want with it.
01:08:46.000 You can take it, make it your own, keep your own little secret sauce if it makes you feel good.
01:08:53.000 They get conflated because free software sounds like free as in beer, not free as in freedom.
01:09:03.000 So, you know, licensing is really what this all coalesces into.
01:09:12.000 But it's been proven that you can make a lot of money with free and open source software.
01:09:18.000 I mean, look at WordPress.
01:09:19.000 It's a hugely successful technology corporation, multi-billion dollars.
01:09:26.000 People share the code.
01:09:27.000 It created a network effect because they did that.
01:09:30.000 It's like the Grateful Dead would let everybody record their music and that's how it spread.
01:09:35.000 So it's actually a good marketing tactic and it also gives transparency so people can see what the hell is going on.
01:09:41.000 Right.
01:09:42.000 Now, when you started this, what was your objective?
01:09:47.000 And were you thinking about it as a potential large-scale source of revenue?
01:09:52.000 Or were you just thinking, this is something that I would like to do and do correctly because I don't think anybody's doing it this way?
01:09:58.000 Open source, pro-freedom of speech, anti-censorship, and to just do the bare minimum amount of managing content.
01:10:10.000 Right.
01:10:12.000 I think everyone should be able to make money.
01:10:15.000 I don't think it should have to be mutually exclusive like you do something for free for everybody and you also can't make money.
01:10:21.000 That's a big misconception.
01:10:23.000 We're trying to give people the tools to make money.
01:10:27.000 We have a monthly recurring subscription system, sort of like a crypto Patreon-type tool, so you can subscribe to people.
01:10:35.000 We had the ability for creators to accept fiat dollars, but we took it out because it's Stripe, and Stripe is a closed-source...
01:10:47.000 Which we just didn't have long-term faith in.
01:10:50.000 So Stripe was some sort of an extension to your site?
01:10:54.000 Yeah, we were using their API to facilitate peer-to-peer payments.
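As an illustration of the kind of integration being described, here is a minimal sketch of a Stripe Connect "destination charge," which routes a supporter's payment to a creator's connected account. This is not Minds' actual code; the API key and account ID are placeholders.

```python
import stripe

stripe.api_key = "sk_test_..."  # platform's secret key (placeholder)

# Charge a supporter $5.00 and route the funds to the creator's
# connected Stripe account (a Connect "destination charge").
payment = stripe.PaymentIntent.create(
    amount=500,                                         # amount in cents
    currency="usd",
    payment_method_types=["card"],
    transfer_data={"destination": "acct_creator123"},   # placeholder account ID
)
print(payment.id)
```

The flip side of this convenience is the dependency described next: the processor's terms of service effectively become part of the platform's content policy.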
01:10:57.000 This is most likely why Patreon banned Carl: the payment processors went to them and were like, look.
01:11:11.000 You know, Stripe has very strict terms.
01:11:13.000 And we didn't want to be, you know...
01:11:18.000 We don't want to be subject to overlords in our company decisions.
01:11:23.000 Do you think that's what happened with Carl?
01:11:26.000 That is most likely what happened.
01:11:29.000 So they stepped in and said, hey, we don't want this guy to be a part of the site.
01:11:32.000 Yeah, and I think that Stripe probably stepped in with a lot of explicit content, controversial content.
01:11:38.000 I mean, it's in their policy that you can't facilitate payments dealing with that type of content.
01:11:42.000 And now we're seeing banks actually go after people.
01:11:45.000 Well, the thing about Carl, though, was that his content, the content that was questionable, wasn't even related to Patreon.
01:11:52.000 It had nothing to do with it.
01:11:53.000 It was on another person's podcast.
01:11:54.000 It was from six months prior, and that other person's podcast was on YouTube.
01:11:58.000 It had nothing to do with Patreon, and they had specifically said that they were not going to act on content that was outside of their network.
01:12:06.000 They were only going to react to things that were on Patreon.
01:12:10.000 Right.
01:12:10.000 Because you can make little blogs and stuff on Patreon, right?
01:12:13.000 They do have some content.
01:12:14.000 Yeah.
01:12:15.000 Yeah.
01:12:16.000 But that's not what Patreon said.
01:12:18.000 Obviously, the processors...
01:12:19.000 I don't want to...
01:12:20.000 Who knows?
01:12:21.000 Yeah.
01:12:21.000 I don't want to act like I know.
01:12:23.000 I also don't want it to seem like I have an ideology that I'm trying to push right now.
01:12:27.000 Like, I'm very open to moving in the direction that makes the most sense for the community.
01:12:34.000 I'm not attached to what I'm thinking.
01:12:38.000 Mm-hmm.
01:12:38.000 Good.
01:12:39.000 I like that.
01:12:40.000 I wish more people would do that.
01:12:43.000 I mean, I try to do that.
01:12:44.000 I'm really getting way better at it.
01:12:47.000 But that's something I actively work on.
01:12:50.000 Like, these ideas that I have, I'm not fucking married to them.
01:12:53.000 Don't argue them.
01:12:54.000 Look at them.
01:12:56.000 If someone says something different, go, huh.
01:12:58.000 Don't go, no man, that ain't right, bro.
01:13:01.000 Because that natural instinct to argue and to claim some sort of a personal identity with your ideas, that's part of the problem that we have.
01:13:12.000 I think it's a main conflict issue with social media.
01:13:17.000 I think?
01:13:32.000 I might correct someone if someone said something that's incorrect, but I'm not going to argue, and I'm not going to insult.
01:13:38.000 I'm just not.
01:13:40.000 It doesn't even work.
01:13:41.000 It doesn't work.
01:13:42.000 It just makes people argue back and insult you back, and nothing ever gets accomplished.
01:13:47.000 Occasionally, you dunk on people, and it's fun.
01:13:50.000 But in reality, especially me, I kind of dunk on people for a living, so I'm just going to...
01:13:56.000 I'm not going to engage.
01:13:57.000 And I don't...
01:13:58.000 This is going to sound corny as fuck.
01:13:59.000 I don't want to hurt anybody's feelings.
01:14:01.000 I really don't.
01:14:02.000 I don't want to be in some argument where someone is looking at their phone like, fucking fuck you!
01:14:06.000 Fuck you!
01:14:07.000 I don't want that.
01:14:07.000 I don't want that.
01:14:08.000 I get that.
01:14:09.000 I know what it is.
01:14:10.000 I know what it is.
01:14:11.000 But it's...
01:14:13.000 In this flat medium, okay?
01:14:16.000 This two-dimensional medium of typing text and then sending text and you type text and send text...
01:14:23.000 The conflict that arises through that is never beneficial, in my opinion.
01:14:28.000 I don't get anything out of it.
01:14:30.000 So, if I'm expressing something, almost always I try to express something about shit I like.
01:14:36.000 Like, oh, I love this new show.
01:14:38.000 Oh, this movie was great.
01:14:39.000 Oh, this is amazing.
01:14:41.000 Like, check out this picture.
01:14:42.000 Yeah, you might want to check this out.
01:14:44.000 It's tone.
01:14:46.000 Honestly, same with me.
01:14:48.000 I was much more...
01:14:50.000 Trying to convince people about what I thought was right.
01:14:54.000 Coming out of college, you think you're all high and mighty.
01:14:57.000 It doesn't work.
01:14:58.000 People just are allergic to it.
01:15:01.000 I'm allergic to it.
01:15:02.000 I cannot handle it.
01:15:04.000 No one wants to talk like that.
01:15:06.000 It's one thing if you're having a good time and trying to show someone up.
01:15:10.000 You can have fun with it.
01:15:12.000 It's more comedic.
01:15:13.000 But when you're actually taking yourself seriously, it's not going to work.
01:15:18.000 No, it's not going to work.
01:15:20.000 And it actually has the exact opposite effect.
01:15:23.000 It's like the expression...
01:15:26.000 How's the expression?
01:15:30.000 Jealousy is like a poison that...
01:15:33.000 How does it go?
01:15:34.000 Jealousy is like a poison that you take yourself because you don't like what someone else is accomplishing.
01:15:40.000 Forgive that terrible job of paraphrasing that.
01:15:43.000 That might be my worst paraphrasing of all time.
01:15:46.000 Mumble mouth motherfucker that I am.
01:15:48.000 But the idea is that it has the exact opposite effect.
01:15:52.000 Like if you're jealous about someone, it actually makes you feel bad instead of them feel bad.
01:15:56.000 It also makes them not want to hang out with you.
01:15:57.000 Oh.
01:15:58.000 Well, you know, they probably don't want to hang out with you anyway, let's be honest.
01:16:00.000 But what you're doing by going back and forth, and I know people who do engage in it, sometimes they have these anxiety moments where they don't sleep for days because they're involved in these Twitter feuds.
01:16:17.000 I mean, I know people that have done this, where they've gotten involved in Twitter feuds, and they'll wake up at 3 o'clock in the morning, they check their Twitter feed, and like, oh, Christ, man.
01:16:24.000 Like, you gotta go on a yoga retreat or something.
01:16:27.000 You can't do this.
01:16:28.000 You can't live your life like this.
01:16:30.000 I think there may be some value to the debate.
01:16:37.000 It should be there.
01:16:38.000 It should be there.
01:16:39.000 Debate.
01:16:39.000 And it's like, okay, I'm not going to spend my time doing it that way.
01:16:43.000 Some people want to spend their time doing it that way.
01:16:45.000 And if there's cool mechanisms for the most voted content to be seen, I mean, okay, that's interesting to check out sometimes to look at feedback.
01:16:54.000 Yeah.
01:16:56.000 It's not nearly as effective a way of communicating your ideas as making something more personal.
01:17:05.000 Even video is more effective than that, because people actually have a chance to look at you. Obviously, in person would be even better.
01:17:14.000 Yeah, well, in-person is obviously the best.
01:17:17.000 And I think my concern, really, about the future is...
01:17:21.000 I'm holding back a sneeze right now.
01:17:22.000 Sorry.
01:17:24.000 Trying to keep it together.
01:17:25.000 Do it.
01:17:25.000 I don't think I can.
01:17:27.000 It's one of those borderline ones.
01:17:29.000 What are you supposed to do?
01:17:30.000 Are you supposed to stare at the light?
01:17:32.000 Are you trying to resist it?
01:17:34.000 No, I'm trying to get...
01:17:35.000 Okay, we're good.
01:17:36.000 We're out of the woods.
01:17:38.000 I lost my train of thought.
01:17:39.000 What were we just saying?
01:17:40.000 AI something?
01:17:42.000 No.
01:17:43.000 Jealousy quote?
01:17:44.000 You're talking...
01:17:44.000 We went way past that, Jamie.
01:17:46.000 I know, but you guys are back and forth.
01:17:46.000 You've been asleep for days.
01:17:47.000 I was looking for it, and then I was...
01:17:50.000 I lost it.
01:17:51.000 I lost it in my holding back a sneeze.
01:17:55.000 Oh, that's what I was worried about.
01:17:57.000 AI. Not AI. Augmented reality.
01:18:00.000 That's what I'm really worried about.
01:18:01.000 Not artificial, but augmented.
01:18:02.000 And my concern is that what we're experiencing right now, in this flat form of two-dimensional text, is something that already eats up an overwhelming amount of a lot of people's time.
01:18:13.000 I mean, you're looking at some kids that are online, social media, eight, ten hours a day just staring at their phones.
01:18:19.000 I'm extremely concerned, and I have some jokes about it in my act, about the next wave, because I think that we're overwhelmed by this incredibly attractive medium where we're attracted to our phones, we're attracted to this style of engaging in information and receiving information and passing information and online arguments and debates and looking at pictures and this constant stream,
01:18:45.000 which, you know, just looking at your phone, it's not that thrilling.
01:18:51.000 It's just like, hmm, it's not that thrilling.
01:18:53.000 It's like, okay, yeah, but it's still getting you all day long.
01:18:56.000 Like, there's nothing really crazy happening.
01:18:59.000 My concern is when something really crazy does start to happen.
01:19:03.000 When you really can have experiences that are hyper-normal, like that are more powerful than anything you can experience in this regular carbon-based physical touch-and-feel world.
01:19:15.000 And once we start experiencing augmented reality, the integration between humans and technology, and then the ability to share augmented reality.
01:19:27.000 If you were at work and you have these fucking goggles on and your girlfriend is at work on the other side of town and you guys both have these similar video pets that are with you and dancing around and providing you with fucking advertisements and giving you things,
01:19:44.000 there's next levels to this stuff that I'm trying to see the future, but I'm too fucking stupid and I don't really know anything about technology, but I know that they're going to get deeper into our lives.
01:19:57.000 I know that these technologies, not they like the government, but these technologies, they're going to get deeper into your life.
01:20:03.000 And that they got you by the balls and the clit with a fucking phone.
01:20:08.000 And it doesn't even do much.
01:20:09.000 Take some pictures, look at some pictures, look at some text, watch some videos.
01:20:14.000 That's all it does.
01:20:15.000 And access to most human knowledge.
01:20:18.000 That's true.
01:20:18.000 But how many people are using that?
01:20:20.000 Well, you know, they are for sure.
01:20:22.000 They definitely are.
01:20:22.000 There's a lot of Googling going on.
01:20:24.000 I'm sorry.
01:20:25.000 What is the other one?
01:20:26.000 DuckDuckGo.
01:20:27.000 DuckDuckGoing going on.
01:20:28.000 I'm holding out for another one.
01:20:31.000 We might start working on search more.
01:20:33.000 Bing is a goddamn ghost town, isn't it?
01:20:35.000 I bet if you go to Bing, you gotta blow fucking dust off your keyboard as soon as you open it up.
01:20:40.000 Like, no one's in there.
01:20:42.000 Who's in Bing?
01:20:43.000 Bing is just Microsoft.
01:20:45.000 I know.
01:20:45.000 But who's using that?
01:20:47.000 Oh, ladies?
01:20:48.000 I think that YouTube is the number two search engine on the web.
01:20:53.000 YouTube?
01:20:54.000 YouTube, yeah.
01:20:55.000 Wow.
01:20:55.000 Yeah.
01:20:56.000 Really?
01:20:56.000 That's why people can make so many videos about so many weird topics and it'll just pop up and you get...
01:21:01.000 I don't think we can stop it.
01:21:03.000 Holy shit.
01:21:03.000 But...
01:21:06.000 Look, it would be fun with the frequency that you go to an arcade.
01:21:12.000 I would go and do some crazy AR, VR stuff.
01:21:16.000 It would be fun as a rare entertainment thing to do.
01:21:21.000 I just want to make sure, even with the robots that we're carrying around now: is it respecting my freedom?
01:21:28.000 Is this thing on my side?
01:21:30.000 It's not.
01:21:31.000 I don't think it is right now, even though I'm using Android, which is open source.
01:21:34.000 Are you an Android guy?
01:21:35.000 Of course you are.
01:21:37.000 All those crypto guys, they're all Android people.
01:21:41.000 It's just more freedom.
01:21:44.000 Now, Google's version of Android is just as bad as iOS.
01:21:48.000 So whose version of Android do you use?
01:21:50.000 I am...
01:21:52.000 Yeah.
01:21:52.000 No, I'm not perfect, man.
01:21:54.000 Are you not telling me?
01:21:54.000 No, I'm not perfect.
01:21:55.000 I'm on the Google Android right now, but there's a version called Replicant, which is a fully free version of Android that I'm probably...
01:22:03.000 Because I just cracked my phone like a day ago, so I might...
01:22:05.000 You have to get a new one?
01:22:05.000 What are you using?
01:22:06.000 What phone do you use?
01:22:08.000 S8. Oh, look at you.
01:22:10.000 Yeah.
01:22:10.000 You're kind of retro.
01:22:11.000 Is it?
01:22:12.000 I don't know.
01:22:12.000 I got it a year ago.
01:22:13.000 There's this one called the Black Phone, which I'm looking into.
01:22:16.000 What's that?
01:22:16.000 It's like a hyper-encrypted phone, but I don't know if it's fully free.
01:22:22.000 Wasn't there a blockchain-based phone that they were coming out with?
01:22:25.000 An Ethereum-based phone?
01:22:27.000 Isn't that?
01:22:27.000 I don't know.
01:22:28.000 Wasn't that something, Jamie?
01:22:29.000 Yeah.
01:22:30.000 Sure, it definitely got announced, but I don't know that it's still in development.
01:22:33.000 You can't run everything on a blockchain.
01:22:36.000 No.
01:22:36.000 Blockchains are pretty slow.
01:22:39.000 We even use it to publish to the Ethereum blockchain when you send each other payments on Minds.
01:22:46.000 It costs, like, you know, there's a gas fee.
01:22:49.000 So the way that the network is powered is that, you know, the miners get paid with gas, with a little bit of ether.
01:22:55.000 So it costs like a buck to do a post.
01:22:57.000 Like, there's fully decentralized social networks.
01:22:59.000 There's one called Peepeth, where you have to pay for everything you do on it.
01:23:04.000 And so this is why it's a cool experiment, but it's really not scalable.
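To put a rough number on the gas-fee point, here is a sketch, using web3.py (v6 naming), of estimating what a tiny on-chain post would cost. The RPC URL and addresses are placeholders, and the "post" is just a 32-byte hash standing in for real content.

```python
# Rough sketch of why on-chain posting costs real money: every transaction
# burns gas, extra data burns more, and gas is paid in ether.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://mainnet.infura.io/v3/<project-id>"))

tx = {
    "from": "0x0000000000000000000000000000000000000001",  # placeholder sender
    "to": "0x0000000000000000000000000000000000000002",    # placeholder recipient
    "value": 0,
    "data": Web3.keccak(text="my post"),  # 32-byte hash standing in for a post
}

gas_units = w3.eth.estimate_gas(tx)   # ~21,000 base + ~16 gas per nonzero data byte
cost_wei = gas_units * w3.eth.gas_price
print("approx fee in ETH:", w3.from_wei(cost_wei, "ether"))
```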
01:23:08.000 So, you know, it's going to be a combination of decentralized technology.
01:23:12.000 Like, not just blockchain.
01:23:15.000 People like to say that blockchain is going to solve all the problems, and it's going to solve a lot of problems.
01:23:20.000 It's an incredible tool.
01:23:24.000 What is this, Jamie?
01:23:25.000 Is this it?
01:23:26.000 The Finney?
01:23:27.000 It's one that's out now.
01:23:28.000 Yeah, it just went on sale like a month ago or something.
01:23:31.000 It's pretty.
01:23:32.000 Sirin OS, which I'm not exactly sure about.
01:23:34.000 What is that, Jamie?
01:23:35.000 Good luck getting a fucking app with that.
01:23:38.000 And some of those Android apps, they're sneaky, right?
01:23:41.000 Don't they steal Bitcoin?
01:23:43.000 There was an Android app that got in trouble for stealing cryptocurrency.
01:23:48.000 It was stealing it in the background while you had your app open.
01:23:51.000 And it was on the Google Play Store.
01:23:53.000 See if that's true.
01:23:54.000 I might have made something up.
01:23:55.000 I can get sued.
01:23:56.000 I don't think I did, though.
01:23:57.000 It might have been mining.
01:24:00.000 And I think it was stealing.
01:24:01.000 It says steals, yeah.
01:24:02.000 Yeah, pull that up so we can see it.
01:24:05.000 I have an Android phone as well.
01:24:07.000 I have a Note 9. I really like it.
01:24:09.000 It's giant.
01:24:10.000 Huge screen.
01:24:10.000 Great battery life.
01:24:11.000 Beautiful.
01:24:12.000 So you use both?
01:24:13.000 Yes.
01:24:14.000 Bitcoin scam warning over fake Android app that steals cryptocurrency from your phone.
01:24:18.000 Yeah, I use both.
01:24:20.000 Android's very good now.
01:24:21.000 It's very good.
01:24:22.000 I was an early adopter and it was like clunky and shitty and then I would go to my iPhone and I was like, oh my god, this is so much better.
01:24:29.000 What the iPhone is great with is integration with, like, Apple TV, integration with a laptop, but I also have a Windows laptop that I use a lot.
01:24:36.000 I really like...
01:24:37.000 I have a Lenovo ThinkPad for writing.
01:24:40.000 The keyboard's better.
01:24:42.000 In fact, I actually bought an older MacBook just for the keyboard.
01:24:46.000 Because as a writer, you want tactile feedback as you're writing.
01:24:51.000 It just helps.
01:24:51.000 It makes it easier for you to recognize where the keys are.
01:24:55.000 And Apple has decided to go so far towards design and just for aesthetic beauty that they've ruined the tactile feedback of their keyboards.
01:25:06.000 Do you remember that, though, when the old smartphones, they still had the keyboard?
01:25:11.000 I thought I would never leave that because it was tactile, but then I ultimately left.
01:25:16.000 That's true, but that's a different experience.
01:25:18.000 That's just thumbs.
01:25:20.000 I can do that with my thumbs and I kind of know where everything is and I'm not writing a novel.
01:25:24.000 You know, when I'm writing material or essays or something like that, I need a fucking keyboard.
01:25:29.000 You don't think that the holographic screen that's just here, you don't think if it just like autocorrects everything you do and you can just like...
01:25:36.000 Maybe, but there's a...
01:25:38.000 I like the tactile, too.
01:25:39.000 There's a feeling.
01:25:41.000 I like mechanical keyboards, in fact.
01:25:43.000 There's a feeling of knowing.
01:25:46.000 Did you test out that FaceTime bug?
01:25:50.000 Did you hear about that?
01:25:51.000 FaceTime bug?
01:25:52.000 Yeah, there was a FaceTime bug where...
01:25:53.000 What was it?
01:25:55.000 You didn't see that?
01:25:55.000 I heard about it, but I didn't look into it at all.
01:25:58.000 Yeah, we didn't test it out and play with it.
01:25:59.000 Basically, you could call someone and hear them without them picking up.
01:26:03.000 Yeah.
01:26:03.000 Without them picking up?
01:26:04.000 Without them picking up, yeah.
01:26:05.000 So, the thing is ringing on FaceTime, and they don't even have to pick up, and then you're on the other end talking shit about them.
01:26:13.000 Yeah, you can basically surveil anybody.
01:26:14.000 Fuck Bill and fuck minds.
01:26:16.000 That guy's full of shit.
01:26:17.000 As soon as the big companies come to him, he's gonna stick his ass in the air, just like all of them.
01:26:21.000 He said the camera could be turned on, too.
01:26:22.000 Yeah.
01:26:23.000 Makes sense.
01:26:24.000 I was thinking about that when I'm beating off.
01:26:26.000 Don't you?
01:26:27.000 You should.
01:26:31.000 Apple acts like it cares about privacy, which maybe it doesn't; maybe they turn certain things over to the FBI. I don't know exactly what's...
01:26:38.000 But we don't know what the Apple phones are doing.
01:26:41.000 Right.
01:26:42.000 Apple is all locked down, closed source.
01:26:45.000 And additionally, there was a creepy speech that Tim Cook just gave.
01:26:50.000 Creepy?
01:26:51.000 Did you see it?
01:26:51.000 No.
01:26:52.000 Let's listen to it.
01:26:53.000 Yeah, let's listen to it.
01:26:53.000 Should we play spooky music in the background?
01:26:55.000 Do the ADL speech.
01:26:56.000 Do you have any spooky music you can play in the background while he's doing the speech?
01:27:01.000 We might get in trouble for that.
01:27:03.000 So, again, it's good intentions.
01:27:06.000 Like, people who want less hate speech, we all want less hate speech, realistically.
01:27:11.000 Of course.
01:27:11.000 We want people to get along better.
01:27:13.000 Yeah.
01:27:14.000 Right.
01:27:14.000 But this idea that, I don't want to give too much away, but, you know, he's acting as if they are going to be the moral authority about the types of content that can exist on the App Store.
01:27:28.000 Yeah.
01:27:30.000 So, I just don't know how that's scalable.
01:27:33.000 Yeah, what does that mean?
01:27:35.000 Let's hear what he says.
01:27:35.000 I was just hoping this is the right one.
01:27:37.000 Is this it?
01:27:38.000 It says, CEO Tim Cook banning hate division is the right thing to do 12-3-2018.
01:27:45.000 Is that it?
01:27:45.000 December?
01:27:46.000 That's it.
01:27:46.000 Okay, let's hear it.
01:27:48.000 Hello, Tim.
01:27:50.000 Volume, please.
01:27:51.000 Our devices connected to the humanity that makes us, us.
01:27:59.000 We do that in many ways.
01:28:01.000 One of the most important is how we honor a teaching that can be found in Judaism, but is shared across all faiths and traditions.
01:28:12.000 It's a lesson that was carried forward by the late Elie Wiesel.
01:28:19.000 May his memory be a blessing.
01:28:22.000 It's a lesson put into practice by America's Muslim community who raised thousands for the victims of the Tree of Life killings.
01:28:35.000 Do not be indifferent to the bloodshed of your fellow man.
01:28:39.000 Do not be indifferent.
01:28:45.000 This mandate moves us to speak up for immigrants and for those who seek opportunity in the United States.
01:28:54.000 We do it not only because their individual dignity, creativity, and ingenuity have the power to make this country an even better place, but because our own humanity commands us to welcome those who need welcome.
01:29:14.000 It moves us to speak up for the LGBTQ community, for those whose differences can make them a target for violence and scorn.
01:29:24.000 We do so not only because these unique and uncommon perspectives can open our eyes to new ways of thinking, but because our own dignity moves us to see the dignity in others.
01:29:40.000 Perhaps most importantly, it drives us not to be bystanders as hate tries to make its headquarters in the digital world.
01:29:52.000 At Apple, we believe that technology needs to have a clear point of view on this challenge.
01:29:59.000 There is no time to get tied up in knots.
01:30:03.000 That's why we only have one message for those who seek to push hate, division, and violence.
01:30:14.000 You have no place on our platforms.
01:30:27.000 You have no home here.
01:30:30.000 From the earliest days of iTunes to Apple Music today, we have always prohibited music with a message of white supremacy.
01:30:39.000 Hold on a second.
01:30:44.000 What do you think they're signaling here?
01:30:47.000 Like, are they signaling that they're about to start censoring things?
01:30:50.000 They already are.
01:30:51.000 They already are.
01:30:52.000 Okay, I agree that you probably shouldn't put white supremacy music on, but there's a lot of really violent stuff that you can get on iTunes, right?
01:31:01.000 I mean, if you go back to the old NWA albums, that's available, right?
01:31:06.000 Oh.
01:31:07.000 I'm assuming, yeah.
01:31:09.000 I don't know that it is for sure, but yeah.
01:31:10.000 Like, Straight Outta Compton?
01:31:12.000 That is some violent shit.
01:31:14.000 And then how about the films that they have?
01:31:16.000 How about the films that you can get on the iTunes store?
01:31:19.000 There's a lot of very, very, very violent films.
01:31:23.000 Like, extremely violent.
01:31:25.000 There's a lot of films that, like...
01:31:27.000 Is it that they're making the distinction between something that's fiction, that although it may be disturbing, you understand that this is a movie and this is something someone wrote versus someone...
01:31:38.000 Art versus...
01:31:40.000 Yeah, versus someone with commentary, their commentary.
01:31:43.000 And then here's the other thing.
01:31:44.000 He was saying hate...
01:31:47.000 And division.
01:31:48.000 They won't promote division.
01:31:51.000 But that's a weird one.
01:31:52.000 Yeah, that means...
01:31:53.000 Like, what does that mean?
01:31:53.000 People who disagree with you.
01:31:55.000 Yeah, what is division?
01:31:56.000 He has good intentions.
01:31:57.000 You can sort of feel it.
01:31:58.000 That's the problem with this.
01:32:00.000 Right.
01:32:00.000 That it's...
01:32:02.000 He's not allowing the conversation to take place.
01:32:05.000 So this is in direct conflict with the Daryl Davises of the world confronting these issues.
01:32:12.000 Right.
01:32:13.000 So...
01:32:14.000 But we can kill it.
01:32:15.000 But I think...
01:32:17.000 When he's saying, you have no place on our platform, they probably feel like you can go somewhere else.
01:32:25.000 He's building a wall.
01:32:27.000 Yeah.
01:32:28.000 I mean, but this is what I'm saying.
01:32:29.000 Everybody kind of feels like you can go somewhere else.
01:32:31.000 That's what happens, though, and that's how things get more radicalized.
01:32:34.000 And everybody goes to Gab.
01:32:36.000 So, I don't know.
01:32:38.000 Look, the conversation needs to take place.
01:32:41.000 People on the left, he's acting like he's speaking for all LGBTQ people.
01:32:46.000 He's not.
01:32:47.000 There's lots of people on the left, and LGBTQ people aren't always on the left.
01:32:53.000 And not all of them want that.
01:32:55.000 Well, not only that, there's division within LGBT and Q. There's a big issue right now with Martina Navratilova, about her discussing the reality of trans women competing against biological women: she opposes it, and she thinks there are some fundamental advantages, which is leading to a lot of weightlifting world records being broken by trans women, and she's like,
01:33:21.000 this is fucking preposterous.
01:33:23.000 Including trans women with penises.
01:33:25.000 Now they're attacking her for being transphobic.
01:33:27.000 So there's not even a united opinion in the LGBTQ community.
01:33:33.000 For sure.
01:33:34.000 And that's why that Meghan Murphy thing, I think?
01:33:37.000 Yes.
01:33:38.000 I go to this restaurant in Bridgeport, Connecticut called Bloodroot, which is like sort of an old-school feminist-like vegetarian vegan spot.
01:33:47.000 In Bridgeport?
01:33:48.000 In Bridgeport.
01:33:49.000 Really?
01:33:49.000 Yeah, yeah, yeah.
01:33:50.000 Bridgeport's kind of, no offense.
01:33:52.000 I know.
01:33:52.000 It's kind of a dumb...
01:33:53.000 It's a pretty wild place, yeah.
01:33:54.000 We would have a...
01:33:55.000 The Gathering of the Vibes Music Festival is cool.
01:33:57.000 I helped organize that.
01:33:59.000 I used to do stand-up in Bridgeport.
01:34:00.000 There's a place called the Joker's Wild.
01:34:02.000 It was a comedy club.
01:34:04.000 I saw the owner beat a guy with a shoe there.
01:34:08.000 Beat a guy in the face with a shoe.
01:34:09.000 Pulled a shoe off and smacked him in the face.
01:34:12.000 I was 24. I didn't know what the fuck was going on.
01:34:16.000 So anyway, that restaurant, they get called, what is it, TERF?
01:34:23.000 Yeah, trans-exclusionary radical feminist.
01:34:27.000 And so, again, they're the old school ones.
01:34:30.000 And they're saying, look, we're not against your battle.
01:34:35.000 We're not against trans rights.
01:34:36.000 Who would be against trans?
01:34:38.000 But they're just saying that's not our thing.
01:34:40.000 So, again, there's diversity.
01:34:43.000 They're trying to clump everyone together in the whole intersectional world.
01:34:47.000 Look, people want to band together.
01:34:50.000 The oppressed groups want to band together.
01:34:52.000 They should.
01:34:53.000 But it's not that simple.
01:34:55.000 Well, there's always going to be differing opinions, and especially when you have something like...
01:35:02.000 Trans women competing against biological women, and, you know, you have someone like Martina Navratilova, who made it her life's work and her career competing as a biological woman.
01:35:15.000 She's gonna have some opposition to that. And then there's the idea that everyone's supposed to be lumped in together with some mandate that no one has really openly discussed; you're supposed to agree, and it fluctuates and moves like the tide. It just changes.
01:35:32.000 It's like this court of public opinion.
01:35:34.000 It's constantly rendering new verdicts.
01:35:37.000 And you have to keep up and catch up.
01:35:39.000 Things that were acceptable just a few years ago are totally unacceptable.
01:35:43.000 I mean...
01:35:44.000 Comedy is the key area, too.
01:35:47.000 Yeah.
01:35:47.000 It is not...
01:35:48.000 What's happening on social media now is not sustainable for comedy.
01:35:52.000 It's fine.
01:35:54.000 Oh, really?
01:35:54.000 Yeah.
01:35:55.000 It is.
01:35:56.000 It is.
01:35:57.000 How?
01:35:57.000 Because it creates outrage.
01:35:59.000 And then comedy relieves that pressure.
01:36:02.000 Like, believe me...
01:36:04.000 There's a lot of blowback, and believe me, there's a lot of debate and discussion, but also, believe me, when someone does do some politically incorrect, really good stand-up, people go fucking bonkers.
01:36:16.000 They love it.
01:36:17.000 It's one of the best times ever right now to do stand-up.
01:36:21.000 People go fucking apeshit.
01:36:22.000 Oh, yeah, no, it's incredible material, but I'm just saying, for comics that are running into issues with getting banned or whatnot, I mean...
01:36:33.000 Well, who's running into issues with getting banned?
01:36:36.000 I mean, I think you know one.
01:36:39.000 Owen?
01:36:39.000 Yeah.
01:36:40.000 Yeah, Owen's had some issues.
01:36:41.000 Yeah.
01:36:43.000 And, you know, you can make some arguments that Owen's not doing so well right now.
01:36:48.000 But he's also developing his following because of the fact there's people that don't agree with him being banned.
01:36:54.000 He's a very specific example.
01:36:57.000 Other people that are being banned...
01:37:01.000 Do you know what other stand-up comedians can you think of?
01:37:06.000 Maybe they haven't been fully banned from social media, but they've had their performances shut down.
01:37:12.000 Who's that one guy?
01:37:13.000 Oh, Nimesh Patel.
01:37:14.000 But that was at a college.
01:37:17.000 It's the same thing.
01:37:19.000 Yes.
01:37:20.000 But universities have been bad for that for a long time.
01:37:24.000 They're the most sensitive of all audiences.
01:37:27.000 And they're the ones who are the most...
01:37:30.000 They believe the most that they're going to change the world and that their ideals are...
01:37:36.000 Their ideals are rock solid and they have to push back against anything that opposes them.
01:37:40.000 Ari was temporarily banned.
01:37:42.000 That was an accident.
01:37:44.000 The Ari thing was he was joking around and they thought he was making a legitimate death threat.
01:37:48.000 He was joking around with a good friend of ours.
01:37:51.000 The algorithms and the moderators are just not...
01:37:55.000 We can't just be having this happen all the time and then they just keep saying, oh sorry, oh sorry.
01:38:01.000 There has to be a new approach completely.
01:38:04.000 It can't just be, oh, let them back on and just keep doing what they're doing.
01:38:10.000 We need to completely re-approach how moderation is happening, the whole policy situation, the transparency situation.
01:38:20.000 It's not just a matter of...
01:38:23.000 Right.
01:38:26.000 Right.
01:38:28.000 Right.
01:38:44.000 One of his arguments was that he believes the ability to communicate is a fundamental right, like the ability to get electricity.
01:38:50.000 Like, if you're in the KKK, you can still order electricity.
01:38:54.000 So, should you be able to just distribute information?
01:38:57.000 If people say no, then you have to say, okay, well, who's to decide what can and cannot be distributed, and then who's to decide if they can go somewhere else?
01:39:06.000 And then what happens if you tell a person they can't go anywhere?
01:39:09.000 Then things get really weird.
01:39:11.000 We're looking at more of a community moderation structure. We've even been considering, like, a juror system, so that if we make a bad decision and someone appeals it, then the community can potentially...
01:39:33.000 I think that's where it's going to go.
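[Editor's note: a minimal sketch of the juror-style appeal idea Bill describes, purely illustrative; the function names, jury size, and supermajority threshold are assumptions, not Minds' actual implementation.]

```python
import random
from dataclasses import dataclass, field

@dataclass
class Appeal:
    post_id: str
    votes: dict = field(default_factory=dict)  # juror_id -> True means "restore"

def select_jury(community, excluded, size=12):
    # Draw jurors at random, excluding the author and the original moderator.
    pool = [u for u in community if u not in excluded]
    return random.sample(pool, min(size, len(pool)))

def decide(appeal, threshold=0.75):
    # Overturn the takedown only on a supermajority of juror votes.
    if not appeal.votes:
        return "uphold"
    support = sum(appeal.votes.values()) / len(appeal.votes)
    return "restore" if support >= threshold else "uphold"

# Usage: a takedown is appealed, and a random jury votes on it.
community = [f"user{i}" for i in range(100)]
jury = select_jury(community, excluded={"author", "moderator"})
appeal = Appeal(post_id="post-42")
for juror in jury:
    appeal.votes[juror] = random.random() < 0.8  # stand-in for real votes
print(decide(appeal))
```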
01:39:41.000 I don't know.
01:39:44.000 That's the uncensorable internet.
01:39:49.000 And this idea that we can do things and then...
01:39:52.000 Just delete them.
01:39:52.000 The GDPR, the European privacy law, has this whole idea of the right to be forgotten online, which is very difficult, because deleting things from any database, especially a blockchain, is not easy.
01:40:06.000 So the idea that you can go on the internet, do crazy shit, and then just have it taken away is a paradox, because privacy means control, but it doesn't jibe with the way that technology works to just be able to delete things.
01:40:27.000 You're writing to a database.
01:40:30.000 That's not even how the universe probably really works.
01:40:32.000 You can't just say, oh, I just went and punched that guy in the face in the bar and I just want to delete that from having happened.
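[Editor's note: a toy demonstration of why a blockchain resists a GDPR-style "right to be forgotten": each entry's hash commits to the previous one, so deleting a record invalidates everything after it. The record format here is generic and illustrative, not any particular chain's.]

```python
import hashlib
import json

def block_hash(record, prev_hash):
    # Each entry's hash commits to its content and to the previous hash.
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for r in records:
        h = block_hash(r, prev)
        chain.append({"record": r, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or block_hash(b["record"], prev) != b["hash"]:
            return False
        prev = b["hash"]
    return True

chain = build_chain([{"post": "hello"}, {"post": "crazy shit"}, {"post": "goodbye"}])
print(verify(chain))   # True
del chain[1]           # "right to be forgotten": remove the middle entry
print(verify(chain))   # False: every later hash no longer lines up
```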
01:40:38.000 Right.
01:40:41.000 Yeah.
01:40:42.000 Yeah.
01:40:44.000 And again...
01:40:45.000 I think that what we're dealing with now is like you have to interface with it, right?
01:40:51.000 You have to interface with your computer, you have to interface with your phone to access all this stuff.
01:40:57.000 My real concern is that that's just a temporary step.
01:41:01.000 And that we're going to just consistently and constantly be interfaced with all of each other.
01:41:07.000 You know, Elon brought up something called Neuralink when he was on the podcast.
01:41:12.000 And he didn't want to fully describe it because he said he couldn't, but he said it's going to be live within a matter of X amount of months.
01:41:18.000 And he was talking about it increasing the bandwidth between human beings and information in a radical way that's going to change society.
01:41:26.000 That is what I'm talking about.
01:41:28.000 Yeah, he's talking about an injection.
01:41:29.000 You basically inject it through your throat, and it's a neural lace that just threads around your brain.
01:41:36.000 What?
01:41:36.000 And yeah.
01:41:37.000 Are you serious?
01:41:38.000 Yeah, that's what he's talking about?
01:41:40.000 Yeah, yeah.
01:41:41.000 He's a damn alien.
01:41:43.000 Trying to turn us into robots.
01:41:44.000 So, but the question is, what's the nature of those robots?
01:41:47.000 Robots are going to exist.
01:41:48.000 They exist.
01:41:50.000 Right, but should you shoot them into your brain?
01:41:52.000 If you're dying of cancer, would you?
01:41:54.000 Go for it.
01:41:55.000 Yeah.
01:41:56.000 I'd want to see God and see what's up.
01:41:58.000 So, like, do the nanobots, you know, that Kurzweil talks about, like, do we have control as a community over those robots?
01:42:07.000 What's the code running those?
01:42:09.000 And are they infallible?
01:42:11.000 I mean, what if they crash?
01:42:12.000 I mean, our fucking TriCaster crashes every other podcast.
01:42:15.000 Yeah, whether it's open source or free or not makes no difference to whether it can fuck up your brain.
01:42:20.000 Right, what if somebody puts that shit in and then, for whatever reason, they have a blown fuse and they stomp on the gas and drive right into a tree?
01:42:26.000 It depends on the level of risk you're willing to take.
01:42:29.000 I mean, you see some of those videos. Like, I've cried at those videos where the woman hears for the first time, and you're like, oh... Yeah, yeah, yeah.
01:42:35.000 And people seeing color for the first time, putting on certain glasses that allow them to see color.
01:42:40.000 Yeah, all this stuff is amazing.
01:42:42.000 I mean, all that stuff is very cool.
01:42:44.000 And in talking to David Sinclair, he was talking about emerging technologies for reversing aging and age-related diseases.
01:42:53.000 I mean, we're entering into an incredibly strange time for the...
01:43:00.000 The influence of technology and innovation on human beings, on our bodies, on our brains.
01:43:06.000 And we're going to have to decide how far you want to go on this ride.
01:43:10.000 Next stop, Far Rockaway.
01:43:11.000 When are you getting out?
01:43:12.000 I lived there.
01:43:13.000 Did you?
01:43:13.000 Yeah.
01:43:14.000 I lived on the beach there.
01:43:16.000 I'm going.
01:43:17.000 No, I don't know.
01:43:18.000 But you know what I'm saying.
01:43:19.000 It's like one of those things like, where do you get off?
01:43:21.000 Where do you go?
01:43:22.000 Okay, that's enough.
01:43:24.000 You know, like with you, you're deleting Facebook, you delete Instagram, and you're just going to be on Minds, and that's enough.
01:43:31.000 I'll go in other places.
01:43:32.000 I mean, there are alternatives that are getting very big.
01:43:36.000 Yes, like what?
01:43:38.000 And together...
01:43:39.000 Like what?
01:43:40.000 Like, Signal has tens of millions of users.
01:43:42.000 I don't know what that is.
01:43:43.000 I've never heard of it.
01:43:44.000 That's like the encrypted messaging app that...
01:43:46.000 Do you know it?
01:43:48.000 It's open source.
01:43:49.000 What is it?
01:43:50.000 Snowden is on their advisory board or whatnot.
01:43:53.000 What is Signal?
01:43:54.000 It's just a messaging app.
01:43:56.000 So a messaging app, like a WhatsApp or like a Twitter?
01:43:59.000 Yeah, like WhatsApp.
01:43:59.000 So you have to know the person and then contact them through it?
01:44:02.000 Yeah, but we're considering using the Signal protocol for our messaging system because our messaging system needs an upgrade.
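[Editor's note: an illustrative sketch of the end-to-end encryption idea behind apps like Signal, using the PyNaCl library's public-key box. The real Signal protocol layers the Double Ratchet, prekeys, and forward secrecy on top of primitives like these; this shows only the basic point that a relay server never sees plaintext.]

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each user generates a keypair and publishes only the public half.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Each side builds a box from its own secret key and the other's public key;
# whatever server relays the ciphertext cannot read it.
alice_box = Box(alice_sk, bob_sk.public_key)
bob_box = Box(bob_sk, alice_sk.public_key)

ciphertext = alice_box.encrypt(b"meet at the comedy store")
print(bob_box.decrypt(ciphertext))  # b'meet at the comedy store'
```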
01:44:10.000 All of us together are going to be able to create a group of apps that are a more open, freedom-supporting privacy alternative.
01:44:24.000 We're not going to solve it by ourselves.
01:44:28.000 It would be way easier if one of these big companies would just switch gears and start doing things the right way.
01:44:35.000 Eight years building this.
01:44:38.000 If one of the big companies, Google, Facebook, had just been free and open source, we would have spent the last seven years building on top of them.
01:44:48.000 Right.
01:44:49.000 Because, you know, they already did something cool that they're sharing with everybody.
01:44:52.000 Right.
01:44:52.000 So closed-source projects actually stifle innovation.
01:44:57.000 Because, if you think about it, we had to reinvent the wheel.
01:44:59.000 We went and built an alternative with much of the same functionality.
01:45:03.000 Think about how much further the world would be if everyone was building on top of more common projects.
01:45:10.000 Okay, but you're looking at it in terms of your own personal benefit.
01:45:14.000 You're looking at it in terms of Minds' personal benefit.
01:45:17.000 I mean, you created this thing.
01:45:18.000 It's not just pure for altruistic reasons.
01:45:21.000 It's a business, right?
01:45:28.000 I think?
01:45:44.000 Do we allow these, air quotes, overlords to dictate what can and cannot be distributed?
01:45:49.000 And how did this happen?
01:45:50.000 Because in the beginning, I bet it didn't happen.
01:45:53.000 I bet in the beginning, you could just put on whatever the fuck you wanted.
01:45:56.000 And then they had to deal with that.
01:45:57.000 And then they had to figure out after a while, okay, maybe we shouldn't have this on.
01:46:01.000 If we're going to sell advertising, we really should maximize the amount of clicks.
01:46:05.000 Okay, how do we do that?
01:46:06.000 Well, we put things in people's feeds that they want to see.
01:46:09.000 We put things that people want to debate about and argue about and political things, all sorts of different things that excite them and get them to be engaged with the platform.
01:46:17.000 That's their business.
01:46:19.000 Their business is...
01:46:20.000 I mean, it's no different in a lot of ways than Amazon or than any other business that wants to grow.
01:46:26.000 How do they grow?
01:46:28.000 Well, they grow by maximizing their profits and by maximizing the amount of eyes that get to their advertising so they get more clicks and more people get engaged.
01:46:38.000 That's what their business is.
01:46:39.000 You're deciding.
01:46:41.000 By saying, if they were open source, look how much further along the world would be.
01:46:44.000 They would be further along, too.
01:46:45.000 I don't know if they would agree with that.
01:46:47.000 I think they're worth fucking kajillions of dollars, so they've figured it out.
01:46:51.000 Well, it just depends on whether or not you think that people have a right to know what is going on.
01:46:58.000 I mean, it's like food transparency.
01:47:00.000 I will talk about that until the end of time.
01:47:06.000 We're interfacing with this, and it's affecting us.
01:47:09.000 I agree.
01:47:10.000 I fully agree with what you're saying.
01:47:12.000 I'm playing devil's advocate by saying that in their position, they have a business, and their business is to make money.
01:47:18.000 And they're going to lose because of what they're doing.
01:47:21.000 Because it's not sustainable.
01:47:23.000 But their business is up.
01:47:24.000 They're losing active users.
01:47:25.000 Are they?
01:47:26.000 Yeah.
01:47:26.000 But I thought their business went up after the hearings.
01:47:29.000 Probably.
01:47:29.000 Did it?
01:47:30.000 But it's not going to last.
01:47:31.000 Why do you say that?
01:47:33.000 It's just the game's over.
01:47:35.000 It's going to take a long time for us to build it up as all of these different organizations and companies working together.
01:47:41.000 But Linux, for instance, is the operating system that most banks – it is the most popular operating system in the world.
01:47:50.000 It's open source.
01:47:51.000 Yeah, it's open source.
01:47:52.000 It's in your phone.
01:47:53.000 It's in everywhere.
01:47:56.000 It got there because of that.
01:47:57.000 Because everyone used it and incorporated it into their product.
01:48:01.000 Facebook, they are all using free and open source software in their stacks.
01:48:06.000 They're just not sharing their product with everybody else.
01:48:09.000 So they're benefiting from it but not giving back.
01:48:12.000 And I almost feel like I shouldn't even be saying that they should just pivot because that's their only chance to survive.
01:48:22.000 So this is based on your estimations of the future.
01:48:27.000 Yeah, it just seems like things are becoming more open.
01:48:38.000 Is that possibly because you engage with a lot of other super nerds and you guys all have these similar ideas?
01:48:38.000 Look at what's happening with Bitcoin.
01:48:40.000 I don't know what's happening with Bitcoin.
01:48:42.000 Bitcoin and Ethereum and lots of other blockchains are growing really fast.
01:48:49.000 Maybe the price is separate.
01:48:51.000 The development energy, the number of people who are building apps on top of Bitcoin and Ethereum is growing massively.
01:48:58.000 It's a whole new infrastructure that's like a common protocol that people can build on.
01:49:05.000 So that is growing rapidly.
01:49:07.000 The price is secondary.
01:49:09.000 That's not even what Bitcoin and Ethereum are really about.
01:49:12.000 It's a decentralized database.
01:49:17.000 So, this is just where the internet is meant to be decentralized.
01:49:21.000 It sort of started out that way.
01:49:23.000 And then we moved into this, like, Web2 silo system with, like, just these massive companies that are controlling everything.
01:49:30.000 But it's going to keep waving.
01:49:33.000 Okay, again, to play devil's advocate, the vast majority of users are not using those platforms.
01:49:38.000 The vast majority of users are using these controlled platforms like Facebook and Instagram and Twitter.
01:49:44.000 Like, if you're talking about, I'm just guessing, but if you're talking about the gross number of human beings that are interacting with each other on social media, they're mostly on controlled networks.
01:49:56.000 You're saying that this is not going to last.
01:49:59.000 But there's no evidence that it isn't going to last.
01:50:02.000 There's tons of evidence.
01:50:03.000 What is the evidence?
01:50:04.000 Wikipedia.
01:50:05.000 What happened to Encarta?
01:50:07.000 Remember that disk you put in your computer that was your encyclopedia?
01:50:11.000 Where is that?
01:50:12.000 No one uses it.
01:50:14.000 Okay, that's different.
01:50:15.000 This is not a social media network.
01:50:17.000 The social media networks that people are using are almost all controlled, right?
01:50:22.000 Yeah.
01:50:23.000 No, it's going to take a very, very long time.
01:50:25.000 How long?
01:50:28.000 I would say 10 years.
01:50:31.000 And what do you think is going to be the catalyst?
01:50:33.000 Like what's going to cause these people to make this radical shift to open source?
01:50:37.000 I think we have to be – we have a responsibility to be functionally competitive.
01:50:42.000 Minds does.
01:50:43.000 Yeah, we do.
01:50:44.000 We're moving there fast.
01:50:46.000 Like we just hired a ton of new developers and – It's going to take time.
01:50:50.000 We're not there yet.
01:50:51.000 Right.
01:50:52.000 But once we have functionally competitive products that you wouldn't even know the difference and there's enough people there, then it's basically the decision of, you know, am I going to choose the one that respects my privacy and freedom or the one that doesn't?
01:51:08.000 And people are – kids don't like Facebook.
01:51:11.000 Everyone is sick of it.
01:51:13.000 We're just drug addicts.
01:51:16.000 Right.
01:51:18.000 Is that what it is?
01:51:19.000 They're just sucked into this thing where you constantly want to check and see who's writing what?
01:51:25.000 Yeah, and there's monopolies, arguably.
01:51:27.000 Yeah.
01:51:28.000 Yeah, right?
01:51:30.000 Especially when Facebook owns Instagram, right?
01:51:32.000 What if they bought Twitter as well?
01:51:34.000 They almost did.
01:51:35.000 I think Google almost did.
01:51:36.000 What if Google steps in and buys everything?
01:51:39.000 Then you're like, oh no.
01:51:41.000 They probably could, right?
01:51:43.000 Yeah.
01:51:44.000 They easily could.
01:51:45.000 They could probably buy everything.
01:51:46.000 Apple could with cash.
01:51:48.000 Yeah.
01:51:48.000 Yeah.
01:51:49.000 Tim Cook could come in with a big purple pimp suit on, just slap down a briefcase.
01:51:53.000 Bitch!
01:51:54.000 I just wonder, and like, look, all these executives...
01:51:59.000 Jack seems like a cool person.
01:52:01.000 He's a very nice guy.
01:52:03.000 I just sense so much inconsistency.
01:52:08.000 He's talking about Bitcoin like it's this important new internet money.
01:52:14.000 He knows the infrastructure is open, but then his platforms are the opposite.
01:52:23.000 Why is he so inconsistent?
01:52:25.000 It's just hypocritical to the maximum.
01:52:29.000 I think it's partly because it's a giant business.
01:52:32.000 And I think when you have an obligation to your shareholders and to maximize profits...
01:52:37.000 And when you're trying to maximize profits, too, and there's this universal growth model where every year it just has to get a little bit bigger, otherwise you're fucking up as a CEO... You don't have to experience that with Minds.
01:52:48.000 You're one of the co-founders.
01:52:50.000 How many people are involved...
01:52:52.000 It's like 15 of us now.
01:52:54.000 And do you have like a board where you sit around where you make critical decisions?
01:52:58.000 Is that stressful as fuck?
01:53:00.000 Yeah.
01:53:00.000 Luckily, we've started off from the point where we're saying, okay, we're embedding principles into how we're doing things.
01:53:07.000 So we're not in a position where we would ever change that.
01:53:14.000 For us to do that would just be a total waste of time.
01:53:18.000 Right.
01:53:19.000 So we're making it harder for ourselves to make money in the beginning.
01:53:24.000 We're making it harder for ourselves to grow because we are not going to compromise people's privacy in order to do those things.
01:53:31.000 And so we're just going to build up slowly, steadily, and just get there when we get there.
01:53:38.000 How much time a day is this?
01:53:40.000 How much of an obligation is this for you?
01:53:44.000 Same as any job.
01:53:46.000 It sucks because, you know, my wife Allie would say, like, it's too blurred, my life.
01:53:54.000 Because it's like, what is pleasure?
01:53:57.000 I mean, it probably happens with you, too.
01:53:59.000 Like, when you're on...
01:54:01.000 Your phone.
01:54:02.000 Like, your family doesn't know if you're working or if you're doing something for fun.
01:54:10.000 Yeah.
01:54:10.000 Because, like, your work is sort of in the digital realm, partially.
01:54:13.000 A lot of it is.
01:54:14.000 Yeah.
01:54:14.000 So it's like, I just need to put it down, like, no phones in bed, these kinds of things.
01:54:19.000 Like, strict lines.
01:54:21.000 Yeah.
01:54:22.000 That is huge.
01:54:24.000 Strict lines are huge.
01:54:25.000 Yeah.
01:54:26.000 Yeah, putting your phone in a physical place and pushing it away.
01:54:31.000 It's huge.
01:54:32.000 It's just the compulsion to look and check Instagram feeds to see if there's any cool pictures.
01:54:38.000 Like, what the fuck am I doing?
01:54:40.000 Why am I compelled to do this?
01:54:42.000 There's no benefit.
01:54:44.000 Occasionally, if I'm bored, like I'm in the dentist's office, you have 10 minutes, all right, let's see what the fuck's going on in the news.
01:54:50.000 Like, maybe.
01:54:51.000 Occasionally.
01:54:52.000 But there's so much of your time that's dedicated to that.
01:54:56.000 So much of it.
01:54:58.000 It's so taxing, and it's so involved, and so many people are doing it.
01:55:04.000 I mean, I went to a restaurant the other day, and I was looking around, and fucking everyone was sitting at a table looking at their phone.
01:55:11.000 It's weird.
01:55:11.000 You ever do the stack game?
01:55:13.000 What's that?
01:55:14.000 Just like, if you're out to dinner with a bunch of people, just everyone's put their phone in a stack in the middle.
01:55:18.000 Well, you could do that or just...
01:55:19.000 And if you touch it, then you pay.
01:55:21.000 Oh.
01:55:21.000 I'd rather just...
01:55:22.000 Just not?
01:55:23.000 Not.
01:55:24.000 Yeah.
01:55:25.000 People wipe their butts and don't fucking wash their hands and touch their phone.
01:55:29.000 And, you know, your phone is filled with all kinds of dirty shit.
01:55:32.000 They've, like, done these swab tests of phones.
01:55:34.000 They're covered with E. coli.
01:55:36.000 And people are gross.
01:55:39.000 I'm a little germaphobe.
01:55:40.000 Keep my fucking hands clean, bro.
01:55:42.000 I'm not, but I know your phone probably has your butt all over it.
01:55:45.000 True.
01:55:46.000 Just be honest.
01:55:47.000 True.
01:55:48.000 But I know what you're saying.
01:55:50.000 It's a good idea.
01:55:51.000 You know, the guys who run Joe Beef in Montreal, it's this amazing restaurant, Fred and Dave, and they were talking about it, that when they go to dinner, they shut their phone off.
01:56:05.000 I'm a good guest, a good table guest.
01:56:07.000 I shut my phone off.
01:56:09.000 I don't engage.
01:56:10.000 I don't check it.
01:56:12.000 It's a similar thing to podcasting, in a way, in that one of the good benefits of podcasting is that for three hours or two hours, whatever the fuck you're doing, you're going to sit down, and you're just going to engage with the person.
01:56:25.000 Just you and I. Me and Bill.
01:56:27.000 We're just talking, right?
01:56:28.000 And that we're not checking our phone.
01:56:30.000 We're not looking at the television.
01:56:31.000 We're not looking at the laptop.
01:56:33.000 There's no distractions.
01:56:35.000 And that is one of the rare moments in life where you get to talk to someone for several hours.
01:56:39.000 And over the last, you know, nine years that I've been doing this podcast, it's benefited me tremendously just in having real conversations with people.
01:56:51.000 We're just sitting across from somebody for hours just talking to them.
01:56:55.000 Getting better at understanding how people think, getting better at understanding how I think, getting way better at communicating and knowing when to talk and when not to talk and what questions to ask and try to understand the thought process that another person has.
01:57:10.000 And you walk out of that with some lessons, like real, legit, tangible lessons.
01:57:15.000 Those fucking don't happen when you're staring at your phone while you're talking to people.
01:57:18.000 It, like, cuts all that off.
01:57:20.000 The conversation stays shallow.
01:57:21.000 You miss important points.
01:57:23.000 Like, oh, I'm sorry, what?
01:57:24.000 What were you saying?
01:57:24.000 You do that kind of shit.
01:57:25.000 And, like, then the other person knows you're not engaged.
01:57:28.000 It's just...
01:57:28.000 It's weird.
01:57:29.000 Yeah, it's all shades of gray.
01:57:31.000 I mean, it's done incredible things for, like, democratizing the ability to share information so it's not just these...
01:57:37.000 Yeah.
01:57:38.000 Juggernaut media companies are the only places that can share information.
01:57:41.000 So it's incredible.
01:57:44.000 It's crucial.
01:57:46.000 We need everyone to have the ability to share, so that you can check, because maybe you're more likely to get the reality of what's going on in the world from your newsfeed than from the big companies.
01:57:57.000 We need management skills.
01:57:59.000 Personal management skills.
01:58:01.000 Yeah.
01:58:02.000 And I think we need to look at them the same way we look at alcohol consumption and even poor food choices.
01:58:10.000 You can have a cheat day and eat a bunch of pizza and some ice cream like The Rock does.
01:58:14.000 No one's going to get hurt, right?
01:58:16.000 But most of the time you should probably take care of your meat vehicle.
01:58:20.000 I think the same thing can be said of your mind.
01:58:22.000 I have a day, a week, where I will fucking plop down on the couch and I don't give a fuck.
01:58:29.000 I just watch bullshit on TV and just relax because I know that I'm redlining it six days a week.
01:58:37.000 And I'm doing three different things at a time.
01:58:40.000 I have three different jobs.
01:58:41.000 I'm working out.
01:58:41.000 I'm trying to take care of my family.
01:58:43.000 I'm writing comedy material.
01:58:44.000 And then, oh, let me see some documentary on some wacky fucking cult or whatever the hell I'm going to watch.
01:58:50.000 And I don't feel guilty when I do that because I know that I've kind of, air quotes, earned it.
01:58:56.000 But I think that that's mental management.
01:59:00.000 And I think we certainly need personal management when it comes to the use of electronic devices.
01:59:07.000 Yeah, personal challenges.
01:59:09.000 Yeah.
01:59:09.000 Challenge psychology is really fucking interesting to me.
01:59:14.000 Like, you guys do the October thing?
01:59:16.000 Mm-hmm.
01:59:17.000 We're thinking about doing it twice a year now.
01:59:18.000 I did this one with some of my friends.
01:59:21.000 Jamie's just going to watch shit.
01:59:23.000 Are you in it too?
01:59:24.000 No, no.
01:59:26.000 Jamie doesn't get in.
01:59:27.000 We did one called the 100 Burpee Challenge.
01:59:32.000 100 burpees a day?
01:59:33.000 I haven't done burpees in...
01:59:34.000 I did them today because I was coming out.
01:59:36.000 I was like, I'm going to fucking do burpees today.
01:59:37.000 Nice.
01:59:38.000 But 100 for time.
01:59:39.000 Every day for 100 days straight.
01:59:42.000 And you are drenched after doing 100 burpees for time.
01:59:48.000 Oh, yeah.
01:59:49.000 Ridiculous.
01:59:49.000 Yeah, it's hard work.
01:59:50.000 And that was the best, most discipline I've ever been working out.
01:59:53.000 It was like me and five friends.
01:59:55.000 My friend's mom did it, too.
01:59:57.000 And it changed my life.
01:59:59.000 Really?
01:59:59.000 It was ridiculous.
02:00:01.000 How did it change your life?
02:00:02.000 I felt better than I have ever felt by far.
02:00:05.000 And that was like a year ago, and I've trailed off.
02:00:10.000 But not just physical challenges, like digital ones too.
02:00:14.000 And like with the ice bucket thing.
02:00:16.000 That was crazy shit.
02:00:18.000 I didn't get involved in that.
02:00:20.000 I didn't do it either.
02:00:20.000 But just watching it happen was just really powerful.
02:00:26.000 I'm like, I'm not throwing water on my head during a fucking drought.
02:00:30.000 Stop.
02:00:31.000 Everybody stop.
02:00:32.000 This is not fixing anything.
02:00:33.000 How about I just write a check?
02:00:35.000 I'll give you some money.
02:00:36.000 Yeah, film yourself writing the check.
02:00:38.000 Yeah, stop.
02:00:39.000 Yeah, throw a glass of water in my face when I'm done with the check.
02:00:41.000 Just stop.
02:00:43.000 But getting communities to sort of pressure each other into doing things.
02:00:50.000 In a positive way.
02:00:50.000 In a positive way.
02:00:51.000 Yeah.
02:00:52.000 Well, that was what the Sober October thing kind of turned out to be about.
02:00:56.000 And there's a lot of lessons in learning that, too.
02:00:59.000 You know, you learn lessons about your reliance on either substances or things.
02:01:05.000 And one of the things that I learned from the Sober October Challenge, the last one, was that when you engage in really rigorous physical activity six and seven days a week, you don't give a fuck.
02:01:16.000 Like, you don't give a fuck.
02:01:18.000 Like, all the chatter, the internal chatter just goes away.
02:01:21.000 All the negative chatter, like, it's like taking a pill.
02:01:25.000 Like, I don't give a fuck pill.
02:01:27.000 It's amazing.
02:01:28.000 It's really amazing, because I think a lot of the personal anxiety that people carry around with them is physical energy that's not being expressed. I think the body has certain demands and a certain potential, and in order to have that potential, like your potential for athletic output, you have to have this energy source, right?
02:01:53.000 And that body energy source, when it's not expressed...
02:01:53.000 And when you're sitting in a cubicle all day, day after day after day, it builds this internal anxious feeling and tension.
02:02:00.000 And that becomes your normal...
02:02:12.000 We're good to go.
02:02:19.000 Blow that shit out every day.
02:02:21.000 Every day.
02:02:22.000 You burn off 2,000 calories and you fucking run for five miles and you do kettlebells and chin-ups and fucking hit the bag for five rounds.
02:02:30.000 Dude, that shit goes away.
02:02:31.000 You don't give a fuck.
02:02:33.000 And then you get to look at things with real clarity.
02:02:35.000 So there was a lesson learned in that.
02:02:37.000 And that lesson was only learned because we decided to challenge each other and push ourselves.
02:02:43.000 Do you think it would be too draconian to have a company 100 burpee a day policy?
02:02:50.000 Yeah, you know why, man?
02:02:51.000 I just don't think you should tell people what to do.
02:02:53.000 Their job is the job, and then everything else is like a cult.
02:02:58.000 You know?
02:02:58.000 It's like, no, we're only going to wear white robes.
02:03:01.000 We don't need anything about white robes.
02:03:02.000 Okay.
02:03:03.000 When do you start fucking everybody and taking their money?
02:03:06.000 Because that always comes next.
02:03:08.000 Yeah, because it's in vain.
02:03:09.000 Yeah.
02:03:10.000 You can't force people.
02:03:11.000 It's like being convinced.
02:03:12.000 But it wouldn't be a bad...
02:03:14.000 Well, the problem is if you were...
02:03:16.000 You could have some sort of a company-wide challenge where you invite people.
02:03:21.000 No, because that's what I'm saying.
02:03:23.000 You would shame them into doing it, or you would somehow or another make it seem like they would advance in the company more if they played along.
02:03:34.000 It could be...
02:03:36.000 Yeah.
02:03:37.000 I'm not going to do it.
02:03:38.000 Were you thinking about doing it?
02:03:40.000 I just did, maybe.
02:03:41.000 Because it feels so good!
02:03:43.000 Yes, yes.
02:03:44.000 Well, you should encourage it.
02:03:45.000 But I almost feel like...
02:03:47.000 Yeah.
02:03:49.000 But you don't want to shame people.
02:03:51.000 There's some brilliant people that don't work out at all.
02:03:53.000 They're brilliant.
02:03:54.000 But for whatever reason, that's their choice.
02:03:57.000 It should be your choice to go out like Christopher Hitchens and just fucking drink every day and smoke cigarettes and one day you get cancer.
02:04:03.000 And you're like, well...
02:04:04.000 You know, I mean, this is like...
02:04:07.000 I mean, the way he described it, like burning the candle at both ends, it gave a beautiful, brilliant light.
02:04:13.000 Yeah, he wouldn't have had those ideas if he had done it another way.
02:04:16.000 It's very possible that's true.
02:04:18.000 And most of the madness that we see in brilliant artists, it's very possible that that madness would not be expressed if they had their shit together.
02:04:28.000 There was something that Sam Harris was saying the other day on your show, just about the free will stuff.
02:04:35.000 And I think that connects to this information theory kind of thing.
02:04:39.000 So if we're just sort of a conglomerate of these actions and we're like flowing the actions through our body in unique ways...
02:04:52.000 I mean, do you accept his theory on free will?
02:04:57.000 Well, it's not his theory.
02:04:58.000 It's a conventional theory of determinism that a lot of people are embracing, and I think there's definitely some merit to it.
02:05:06.000 However, you and I both know that you choose whether or not you decide to do something, right?
02:05:11.000 You choose whether or not you...
02:05:14.000 Someone says something to you that's kind of shitty and you choose whether you decide to email them back something shitty.
02:05:20.000 Like you have that initial impulse.
02:05:21.000 Like, well, hey man, fuck you.
02:05:23.000 You have that initial impulse.
02:05:24.000 You think on it.
02:05:25.000 You sleep on it.
02:05:26.000 But why are you thinking on it and sleeping on it?
02:05:28.000 Are you doing that because of determinism?
02:05:30.000 Are you doing that because you're trying to be a better person?
02:05:32.000 And are you trying to be a better person because of all the factors that played out in your life?
02:05:37.000 Like...
02:05:37.000 Environment, genes, life experience, all those things.
02:05:42.000 It's a really good discussion.
02:05:45.000 So do you own the words that you're saying right now?
02:05:47.000 That's a good question.
02:05:48.000 Larry Lessig, who was on here the other day, you guys didn't even talk about this, but he basically is one of the founders of Creative Commons and this whole licensing structure for content.
02:05:56.000 Like what we're saying right now, this is going to be licensed.
02:05:59.000 I don't know how.
02:06:00.000 How's it going to be licensed?
02:06:00.000 You and I? Some form right here.
02:06:02.000 This discussion is going to be licensed?
02:06:04.000 Yeah.
02:06:05.000 I think you're licensing it in a certain way.
02:06:08.000 Okay.
02:06:08.000 So you have the ability to license it however you want.
02:06:11.000 You could say, hey, anyone can take this and cut it up and remix it.
02:06:15.000 Or you could say, no, it's locked down.
02:06:18.000 But he helped create this whole licensing array of like six different licenses.
02:06:25.000 One says, you can do absolutely anything you want with this.
02:06:28.000 Another says, you can share it, but you can't make money off it.
02:06:30.000 There's a handful.
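[Editor's note: the "handful" is the six standard Creative Commons licenses, shown here as a lookup table. The helper function is an invented illustration, but the license terms themselves are the real ones; CC0, full public domain, sits outside this set.]

```python
# The six Creative Commons licenses, as a permissions table.
CC_LICENSES = {
    "CC BY":       {"attribution": True, "commercial": True,  "derivatives": True,  "share_alike": False},
    "CC BY-SA":    {"attribution": True, "commercial": True,  "derivatives": True,  "share_alike": True},
    "CC BY-NC":    {"attribution": True, "commercial": False, "derivatives": True,  "share_alike": False},
    "CC BY-NC-SA": {"attribution": True, "commercial": False, "derivatives": True,  "share_alike": True},
    "CC BY-ND":    {"attribution": True, "commercial": True,  "derivatives": False, "share_alike": False},
    "CC BY-NC-ND": {"attribution": True, "commercial": False, "derivatives": False, "share_alike": False},
}

def can_remix_for_profit(license_name):
    terms = CC_LICENSES[license_name]
    return terms["derivatives"] and terms["commercial"]

print(can_remix_for_profit("CC BY"))     # True: "do anything you want" (with credit)
print(can_remix_for_profit("CC BY-NC"))  # False: share it, but don't make money off it
```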
02:06:31.000 And so...
02:06:33.000 The free will stuff is connected to how we're dealing with information.
02:06:40.000 And, like, if you...
02:06:41.000 Because if you think...
02:06:44.000 Realistically, we don't own what we're saying.
02:06:46.000 We're a part of it.
02:06:47.000 We're a conduit.
02:06:48.000 We're a unique conduit.
02:06:51.000 So I don't think it aligns with how the universe works to really be locking down information.
02:07:00.000 I think that it makes sense probably in certain short-term business ways.
02:07:07.000 But, you know, I think we have to open it up to what's really going on.
02:07:12.000 What do you mean by, like, locking down information?
02:07:15.000 Like source code.
02:07:17.000 Like classified files.
02:07:19.000 Like our content.
02:07:21.000 Like music.
02:07:22.000 Like video.
02:07:23.000 And now how does this connect to determinism and whether or not you have free will?
02:07:28.000 Because it's...
02:07:30.000 Are you the creator of your information?
02:07:34.000 Mm-hmm.
02:07:35.000 Well, you are certainly if you put in the work.
02:07:37.000 Like, let's say you decide to write a book.
02:07:39.000 I mean, you put hundreds and hundreds of hours into this book and edit this book and then you release the book and someone says, no, you didn't create that.
02:07:49.000 You're a product of determinism, and I'm going to just steal your book.
02:08:14.000 That's intellectual theft.
02:08:14.000 That's why attribution is the key part of the Creative Commons licensing structure.
02:08:19.000 Always saying, if you come up with a joke, you know, it came from here.
02:08:22.000 But, what if it's a profit?
02:08:24.000 Like, say if you wrote a book, and I say, hey, this is a great book written by Bill Ottman.
02:08:29.000 Give me five bucks for it.
02:08:30.000 I'm putting it up on my site.
02:08:31.000 Do it.
02:08:32.000 Fuck that.
02:08:32.000 I mean, here's the thing.
02:08:34.000 I...
02:08:34.000 What if somebody makes all the money off of your book because they have a better platform to sell your book, and they don't give it to you at all, and you wrote the book?
02:08:41.000 You spent all the time.
02:08:42.000 You did all the work.
02:08:44.000 I would, for certain content that I create, completely give it away.
02:08:48.000 That sounds like a guy who's never written a book.
02:08:50.000 I've written a book.
02:08:51.000 Did you?
02:08:52.000 I mean, I've written a lot of content, yeah.
02:08:56.000 But have you written a book?
02:08:56.000 I give away – I've not published, but yeah.
02:09:00.000 But a book that you – like if you were an author.
02:09:02.000 But say if you were – I'm not saying people should be forced to do this.
02:09:06.000 I'm just saying that this, I think, is how creativity happens.
02:09:13.000 And I just don't – People deserve to make money on their content.
02:09:20.000 Right.
02:09:20.000 And you deserve to own your stuff.
02:09:22.000 But I don't think that that's actually how the universe works.
02:09:28.000 And I don't think it's acceptable to say, oh, free will doesn't exist.
02:09:31.000 I own your content.
02:09:34.000 Yeah, that's why I'm struggling to see how they're connected.
02:09:38.000 Because if you're not the originator, then you're not the owner.
02:09:43.000 That's a weird argument because you are the originator.
02:09:46.000 Stephen King wrote all Stephen King books.
02:09:48.000 You are.
02:09:48.000 You're the unique conduit.
02:09:51.000 You are the originator of that specific configuration of information.
02:09:56.000 And you deserve to be able to do everything you're saying with it.
02:09:59.000 I'm just saying that...
02:10:03.000 I don't know.
02:10:04.000 It's complex.
02:10:05.000 Well, it is complex if you're saying that all human beings, essentially, all of your actions have been determined by a lot of factors that are outside of your control.
02:10:18.000 Whether it's genetics, again, life experience, education, all the different factors.
02:10:23.000 Your environment.
02:10:25.000 Is that what's causing you to put out a fucking brilliant record?
02:10:33.000 It's part of it.
02:10:35.000 Maybe you have 50%.
02:10:37.000 Maybe you have 50%.
02:10:38.000 Everything else has 50%.
02:10:40.000 I don't know what the percentage is.
02:10:42.000 But if you're a musician...
02:10:44.000 And someone like Spotify comes along and says, boot, you didn't even make that, dude.
02:10:49.000 So we're just going to put it on Spotify and make millions and give you pennies.
02:10:52.000 That's not what I'm advocating.
02:10:54.000 I'm saying Led Zeppelin uses the blues.
02:10:59.000 Well, more than that.
02:11:00.000 More than that.
02:11:02.000 There's real plagiarism.
02:11:05.000 But that doesn't mean that those aren't great records, obviously.
02:11:09.000 It's true.
02:11:10.000 It is true.
02:11:11.000 Yeah, I mean, Led Zeppelin is a legit gray area.
02:11:15.000 You know, I found out this...
02:11:17.000 Bill Burr called me up and left this really disturbed message.
02:11:20.000 He was, like, really bummed out when he watched...
02:11:22.000 And Bill's a musician.
02:11:23.000 He's a drummer.
02:11:24.000 And when he saw videos of Led Zeppelin's music played back-to-back with the band that used to open for Led Zeppelin, we played it on the podcast.
02:11:33.000 We were like, holy shit.
02:11:35.000 Like, they just stole stuff.
02:11:38.000 They just stole giant chunks and riffs and, you know, and...
02:11:42.000 I mean, they made it better.
02:11:44.000 I guess.
02:11:47.000 But yeah, but that's a different thing than Stephen King's book.
02:11:50.000 Why?
02:11:50.000 Because Stephen King had to spend countless hours in front of his laptop trying to go over each and every sentence and each and every paragraph and suck you in and rope you in and all this work.
02:12:03.000 No, they stole stuff, dude.
02:12:05.000 They stole certain phrases, but you think he didn't use a single phrase
02:12:09.000 Anywhere in any of his books that he didn't pull from somewhere?
02:12:14.000 No, he certainly has.
02:12:14.000 Yeah, he certainly has.
02:12:16.000 I don't think it's the same, though.
02:12:18.000 I think it's similar.
02:12:20.000 Led Zeppelin, they also had to spend countless hours recording that performance to get it to the level of awesomeness that we heard.
02:12:29.000 That wasn't easy to do.
02:12:30.000 They just did a bitch-ass move and they didn't pay those people.
02:12:34.000 Yeah, no matter what, you should be attributing.
02:12:37.000 If you're taking ideas, put it in the footnotes.
02:12:39.000 Why does it hurt?
02:12:39.000 It doesn't make your art worse.
02:12:41.000 Because then they'd have to admit they stole the riff for Stairway to Heaven from their opening band, and then people would go, what?
02:12:47.000 And then they would see it, and then they would look at Led Zeppelin differently.
02:12:51.000 But, you know, human beings are fucking severely flawed.
02:12:55.000 I don't know if I buy that with this idea that you're saying, in terms of authors creating content.
02:13:03.000 I'm not trying to sell something.
02:13:05.000 No, I know.
02:13:05.000 But if they are, I don't think someone should be able to copy their stuff and sell it.
02:13:10.000 I don't think they should either.
02:13:12.000 But what do you think they should be able to do?
02:13:14.000 I think you should be able to decide.
02:13:16.000 Okay, you should be able to decide.
02:13:17.000 So if you're the content creator, you should...
02:13:19.000 Okay, I agree with that.
02:13:20.000 Yeah, it is, I mean, obviously I play devil's advocate a lot, but that's how you get to the bottom of these conversations.
02:13:27.000 But it is a very complicated issue.
02:13:28.000 The complicated issue of who you are and why you are who you are and who you are at this moment versus who you are a decade ago or two decades ago.
02:13:36.000 It's all very weird, you know?
02:13:39.000 I mean, you go back and think about stuff from high school and you're like, Jesus, am I really even that person?
02:13:44.000 Yeah.
02:13:46.000 I've talked to my sister about stuff that happened when we were in high school.
02:13:49.000 Hey, you remember that guy?
02:13:50.000 Oh, he said to say hi.
02:13:52.000 Is that even me?
02:13:53.000 Do I even know that person?
02:13:54.000 Is that really me?
02:13:55.000 If I see them again, I'll be like, oh yeah.
02:13:59.000 Oh yeah, we had 10th grade science together.
02:14:02.000 Oh yeah.
02:14:03.000 Huh.
02:14:03.000 Crazy.
02:14:04.000 I ran into a guy from my high school a couple of weeks ago.
02:14:07.000 It was weird.
02:14:08.000 It was so weird.
02:14:09.000 You know, he remembered some strange story from English class.
02:14:12.000 And I was like, wow, you remember that?
02:14:13.000 Like, how weird.
02:14:14.000 And while he was talking to me, I'm like, is that even really me?
02:14:18.000 Like, is he even really talking about me?
02:14:20.000 Because I don't have any connection to the stuff that he's saying.
02:14:24.000 And I understand that he has this vague, distant, ghost-like memory in his mind of some slide images that he's pieced together that he recognizes as a past interaction.
02:14:37.000 I mean, it's...
02:14:39.000 Fucking strange.
02:14:40.000 It's super strange too, like in, you know, 2050 or 2045, whatever.
02:14:45.000 You know, if your body can be replaced one piece at a time, as time goes on, then your body literally, you could survive, but your body is going to be like almost completely different.
02:14:59.000 Exactly.
02:14:59.000 It's like the boat analogy.
02:15:03.000 Was it Graham Hancock that used that analogy?
02:15:06.000 Somebody used this analogy of certain boats that are like really ancient boats that are on display and every single piece of them from the original boat has been replaced because they rotted away.
02:15:15.000 And you're like, okay, what am I looking at?
02:15:17.000 What is this really?
02:15:19.000 Yeah.
02:15:19.000 And that's kind of us.
02:15:21.000 And once that becomes a physical thing.
02:15:25.000 I met the guy who got his arm and his leg bitten off by a shark.
02:15:31.000 You ever see that guy?
02:15:32.000 He's got carbon fiber arms and legs.
02:15:33.000 John Joseph brought him to the comedy store.
02:15:35.000 I met him at the UFC when you gave him tickets.
02:15:38.000 I shook his hand and I was like, oh shit, that was weird.
02:15:41.000 Super nice guy, but he's got this...
02:15:44.000 It's like a carbon fiber hand and forearm that moves around like a hand.
02:15:51.000 He shakes your hand and then he walks with no limp.
02:15:54.000 He's got this carbon fiber, I think from the knee down the shark bit his leg off.
02:16:00.000 It's fascinating.
02:16:01.000 You're like, okay, you're still a person.
02:16:03.000 You're still here.
02:16:05.000 There he is.
02:16:06.000 There's a gentleman right there.
02:16:07.000 Paul de Gelder.
02:16:08.000 Paul de Gelder.
02:16:09.000 Super nice guy.
02:16:10.000 But that is a fake arm that he's got from his arm being chomped off by a fucking shark.
02:16:18.000 See where it is?
02:16:19.000 From his right thigh, like mid-thigh down, and his right elbow down, all that shit chewed off by a shark, and he's still jacked.
02:16:28.000 Look at him.
02:16:30.000 No excuses.
02:16:31.000 And that's going to keep becoming more biological.
02:16:33.000 Right.
02:16:33.000 Well, the real concern is, remember Six Million Dollar Man?
02:16:38.000 Do you remember that television show?
02:16:39.000 No, you're younger than me.
02:16:40.000 There was a show called The Six Million Dollar Man.
02:16:43.000 And The Six Million Dollar Man, he had been in some sort of a pilot accident.
02:16:48.000 And they said, gentlemen, we can rebuild him.
02:16:50.000 We can make him better than he was.
02:16:52.000 Better, stronger, faster.
02:16:54.000 And they give him these bionic parts.
02:16:56.000 They gave him a bionic arm, and they gave him bionic legs.
02:16:59.000 And he could run like 60 miles an hour, like...
02:17:01.000 He would run like crazy fast and he had these artificial arms and artificial legs.
02:17:05.000 Then they had a bionic woman, same shit, except she was hot.
02:17:08.000 And she had artificial legs and I think she could see things that other people couldn't see.
02:17:12.000 Like one day, I mean that was cool when you'd look at that.
02:17:15.000 You're like, wow, look what he could do.
02:17:16.000 Like he got in a fight with Bigfoot on the TV show.
02:17:18.000 It's really stupid.
02:17:19.000 But one day people are going to be given the option.
02:17:24.000 Maybe it's an option now.
02:17:25.000 Do you want to keep your legs or do you want to get these legs that allow you to jump over a building?
02:17:30.000 I'm curious if there's really like superhuman projects that are going on where people actually can have these abilities.
02:17:40.000 With classified information, we just know that there's stuff we don't know about, extraordinary projects.
02:17:51.000 So, you know, this being the future, I feel like there's a disconnect between the state of technology on planet Earth right now, like, what the public has access to, versus what the, you know, black projects have access to.
02:18:06.000 And that is really not cool because it's not fair for humanity to not understand what is going on.
02:18:18.000 I think that's true, but I also think that most of the state-of-the-art stuff is peer-reviewed, right?
02:18:26.000 I mean, there's so many different people working on these different technologies, like CERN. They're working on the Large Hadron Collider or anything else.
02:18:33.000 There's so many different people working on it.
02:18:36.000 The people that are at the forefront of the technology, unless they're all gobbled up by the dark government, you know, the people at the head of the line, kind of understand where the technology is at currently.
02:18:46.000 For sure, for you and I, we don't know what the fuck's going on.
02:18:50.000 But I think you're right.
02:18:51.000 I think there's probably some government programs where they scoop up the wisest and the brightest.
02:18:57.000 And, you know, they got Oppenheimer, you know, and got him to develop the Manhattan Project.
02:19:03.000 There's probably some shit going on right now.
02:19:04.000 What do you think's happening?
02:19:07.000 What do you know, Bill?
02:19:08.000 Tell me.
02:19:09.000 I want there to be huge Freedom of Information Act reform.
02:19:17.000 We know there are trillions going into the black budget.
02:19:20.000 So Trevor Paglen wrote a cool book, I think it was him, called Blank Spots on the Map.
02:19:27.000 And it just talks a lot about the black budget.
02:19:31.000 So we know it exists.
02:19:33.000 We know...
02:19:35.000 I don't know.
02:19:36.000 But it's holding us back.
02:19:39.000 But maybe – I'm not saying everything should be shared because what if you have like a bioweapon?
02:19:44.000 Right, right.
02:19:45.000 So we need to understand.
02:19:46.000 I think that we need to push the threshold with what the public has access to.
02:19:51.000 Like we need to go way deeper.
02:19:53.000 It's complicated.
02:19:54.000 Yeah.
02:19:54.000 It really is, right?
02:19:56.000 You know, it really is.
02:19:58.000 I really appreciate your perspective, and I really appreciate your point of view, and I really appreciate your ethics and what you're working towards with Minds, and that's one of the reasons why I wanted to talk to you.
02:20:08.000 I think it is important.
02:20:09.000 And as much as I fuck around and play devil's advocate, I do that to try to get to, you know, how you're thinking and whether or not you've had these arguments in your own mind.
02:20:18.000 But I think, ultimately, I've said this before, and I don't know if it makes sense, because again, I'm not that smart.
02:20:26.000 I really wonder if there's bottlenecks for progress that we're going to run into.
02:20:34.000 And I think, ultimately, information is one of the big ones.
02:20:39.000 And information also, in a lot of ways, is money.
02:20:44.000 You know, I mean...
02:20:46.000 When we think of money, we're thinking of ones and zeros that are being moved around on bank accounts.
02:20:53.000 It's data.
02:20:55.000 I mean, it's attributed to different people and you get to do more things because you have more of these numbers and more of these things.
02:21:01.000 But what is it really?
02:21:02.000 It's not gold-based anymore.
02:21:04.000 It's not a physical material object that you're coveting.
02:21:08.000 Now it's some weird thing.
02:21:10.000 And it's kind of like information on a database.
02:21:13.000 And what if we get to a certain point in time, and I sort of feel like in this weird, vague, abstract way, we're moving towards this.
02:21:22.000 It's one of the things that when I really step back and wonder about this trend towards socialism and social democratic thinking, I wonder what that is.
02:21:30.000 And I honestly think that we're moving towards this idea that, hey, we've got a lot of fucking problems that could be cured if you move some of that money around.
02:21:41.000 But should you be able to move some of that money around?
02:21:44.000 And what happens if that money becomes something different?
02:21:49.000 What if people start developing social currency instead of financial currency?
02:21:56.000 What if your ability to do things was based on how much you actually put in?
02:22:00.000 I mean, we're assuming, right?
02:22:01.000 We assume that the way we do things now, where if you want to buy a car, you have to have $35,000.
02:22:07.000 That's how much a Mustang costs, and you've got to bring it to the bank, and this and that, and you get approved for a loan.
02:22:12.000 But what if we get to a time in the future where it's not these pieces of paper that give you material objects, but rather your own actions and deeds?
02:22:22.000 Provide you with social currency that allows you to go on vacations, or allows you to eat at restaurants, or allows you to do things, and there's this running tally.
02:22:30.000 That's not outside of the realm of possibility.
02:22:33.000 No, I think reward systems within everything that we're using are gonna rise up.
02:22:39.000 I mean, that's what we're already kind of doing.
02:22:42.000 I mean, we reward tokens for activity.
02:22:46.000 We're gonna see...
02:22:46.000 But what I'm saying is, if we're doing it in... if it's a social currency, and your own personal behavior allows you to access more freedoms or more goods or more things,
02:23:02.000 it would encourage people.
02:23:04.000 Positive behavior and community-based behavior, because that would be the only way to advance.
02:23:10.000 I mean, obviously this is a long time down the line, but when the first caveman, you know, traded the first fucking shiny rock for the first spearhead, you know, whatever it was that they did that started this whole inevitable trend towards money, this is not something that has to be this way forever.
02:23:27.000 And I wonder, when we're looking at the distribution of information, which is arguably, not arguably, it's never been like what we have today.
02:23:37.000 There's never been a time in human history where everyone had so much access to information that you used to have to pay for.
02:23:42.000 You used to have to go to schools.
02:23:44.000 You used to have to earn your way to the position where you could open the very books that had all this information in it.
02:23:50.000 Now you just get it off your phone.
02:23:52.000 It's instant.
02:23:53.000 And this is a whole different way of interfacing with information.
02:23:57.000 I think this is going to affect higher learning institutes.
02:24:00.000 I think it's going to affect a lot of different things.
02:24:01.000 But I wonder if this all can be applied ultimately someday, maybe not in our generation, but someday to money, that people start using social currency.
02:24:13.000 And that social currency is going to be almost like we have some sort of a database of social currency in this country.
02:24:20.000 A distributed database.
02:24:22.000 Yeah.
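[Editor's note: a toy version of the "running tally" of social currency Joe describes, kept as an append-only ledger over a distributed database. The names, credit values, and costs are invented for illustration.]

```python
from collections import defaultdict

LEDGER = []                   # append-only history of community-verified deeds
BALANCES = defaultdict(int)   # running tally per person

def record_deed(person, deed, credit):
    # Deeds, not dollars: each verified action appends to the ledger
    # and raises the person's balance.
    LEDGER.append({"person": person, "deed": deed, "credit": credit})
    BALANCES[person] += credit

def can_afford(person, cost):
    # Access to goods is gated on accumulated social currency.
    return BALANCES[person] >= cost

record_deed("joe", "volunteered at the food bank", 40)
record_deed("joe", "organized a neighborhood cleanup", 25)
print(can_afford("joe", 50))  # True: deeds unlock the vacation
```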
02:24:22.000 As long as the government can be running on open systems, I think the reason we struggle with trusting the government to distribute wealth is because it's so inefficient.
02:24:34.000 We want to be deciding where it goes.
02:24:36.000 Well, they're also corrupt as fuck.
02:24:38.000 I mean, there's no doubt about that.
02:24:41.000 I mean, at the end of the day, that's a giant problem, period.
02:24:44.000 If the people that are deciding what we can and can't do with information are also corrupt, which, I mean, there's laws that allow them to be corrupt, but it doesn't mean that they're not corrupt, right?
02:24:57.000 I feel like every politician... the only politicians that I would support at this point, I want them to be pulling us in a direction that makes their own position irrelevant.
02:25:09.000 Basically, building open, secure voting systems that allow the planet or the country to decide and vote on what we're doing.
02:25:21.000 I mean, you know, I just think that we need more accurate representation of the consciousness of the communities.
02:25:32.000 And it shouldn't just be these singular people deciding for everybody.
02:25:38.000 We have the tech.
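[Editor's note: one standard building block for the open, auditable voting Bill is gesturing at is a hash commitment: voters publish a commitment first, then reveal the vote and nonce so anyone can re-check the tally. This sketch shows only that commit-reveal step, not a full voting system, and is not any specific proposal.]

```python
import hashlib
import secrets

def commit(vote):
    # Commit phase: publish a hash that binds the vote without revealing it.
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{vote}:{nonce}".encode()).hexdigest()
    return digest, nonce

def verify(digest, vote, nonce):
    # Reveal phase: anyone can re-hash the revealed vote and nonce
    # to confirm no ballot was altered after the fact.
    return hashlib.sha256(f"{vote}:{nonce}".encode()).hexdigest() == digest

digest, nonce = commit("yes")
print(verify(digest, "yes", nonce))  # True: the public tally checks out
print(verify(digest, "no", nonce))   # False: a swapped vote is caught
```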
02:25:40.000 Right.
02:25:40.000 And by the time they get in there, they're so compromised by the special interest groups that are helping them out and all the different people that are contributing to their campaign fund.
02:25:49.000 Do you see anybody like that on the horizon?
02:25:53.000 I think that there are – not specifically right now.
02:25:57.000 I don't see anyone talking about open systems and secure voting and completely changing the way that we're making decisions.
02:26:10.000 But I think that's probably just because they don't know about it.
02:26:12.000 I think there would be a lot of politicians who would be okay with that.
02:26:17.000 Or want us to move in that direction.
02:26:18.000 But I think we need more technologists, scientists in these positions building the things that we're using.
02:26:26.000 Yeah, and with an ethic of freedom.
02:26:29.000 Yeah.
02:26:30.000 Yeah.
02:26:31.000 All right.
02:26:32.000 Dude, great conversation, man.
02:26:34.000 I really appreciate it.
02:26:35.000 Tell people how they can get on Minds, how they can check it out.
02:26:40.000 And do you guys have an app as well as...
02:26:43.000 We have an app.
02:26:45.000 You go to Minds.com slash mobile to get the app.
02:26:47.000 We're not on Google Play.
02:26:48.000 We are still in the Apple Store.
02:26:50.000 Google Play won't let you in?
02:26:51.000 No.
02:26:51.000 How come?
02:26:53.000 They're scared.
02:26:54.000 They're scared.
02:26:54.000 Yeah, the nipple.
02:26:56.000 But find me at Minds.com slash ottman.
02:26:58.000 Hopefully we'll get you on there.
02:26:59.000 Yeah, well, I'm on.
02:27:00.000 I just...
02:27:00.000 Yeah, I haven't posted anything.
02:27:01.000 All right.
02:27:02.000 But you sent me my account.
02:27:03.000 Yes, thank you.
02:27:04.000 Appreciate it.
02:27:05.000 Let's do it.
02:27:05.000 Thanks, buddy.
02:27:06.000 Thank you.
02:27:06.000 Thanks for coming on, man.
02:27:07.000 It was really fun.
02:27:07.000 I think we got a lot out of it.
02:27:08.000 Thanks.
02:27:11.000 That was great.