The Joe Rogan Experience - March 15, 2022


Joe Rogan Experience #1792 - Daryl Davis & Bill Ottman


Episode Stats

Length: 1 hour and 54 minutes
Words per Minute: 178.1
Word Count: 20,304
Sentence Count: 1,749
Misogynist Sentences: 14


Summary

In this episode of the Joe Rogan Experience, Joe sits down with Bill Ottman, founder of Minds, a social network committed to free speech and open source, and Daryl Davis, the musician who has persuaded more than 200 Ku Klux Klan members and neo-Nazis to abandon their ideologies through one-on-one dialogue. They discuss the emerging landscape of alternative social networks, why open source and transparency are the litmus test for any of them, and "The Censorship Effect," a paper Ottman and Davis co-authored arguing that deplatforming increases certainty and accelerates radicalization. The conversation also covers Twitter's ban of Bret Weinstein's Unity 2020 project, big tech's growth-hacking and data-collection tactics, wake-word listening, and privacy-focused hardware such as the Librem 5, the PinePhone, and GrapheneOS-based phones. If you're interested in alternative social networks, this is a must-listen episode.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 And we're up.
00:00:13.000 Gentlemen, what's happening?
00:00:14.000 Good to see you.
00:00:14.000 Hey, you're happening, man.
00:00:15.000 Good to see you again.
00:00:16.000 Thank you for having me back.
00:00:17.000 Beautiful purple shirt.
00:00:18.000 I love it.
00:00:18.000 Thanks for having us.
00:00:19.000 And thank you.
00:00:20.000 And Bill, first of all, tell me what's going on with Minds.
00:00:23.000 Minds is one of the first that I was aware of, like alternative social media networks that was committed to free speech.
00:00:33.000 How's it going?
00:00:34.000 It's going.
00:00:35.000 I mean, there's sort of a whole landscape of alternative networks emerging.
00:00:40.000 And so you've got this spectrum of apps where you've...
00:00:43.000 Like, I think of...
00:00:45.000 I put everything through a litmus test when I'm thinking of an alternative network.
00:00:49.000 Basically, is it transparent?
00:00:51.000 Does it publish their source code?
00:00:52.000 Most of these alternative apps, I don't need to name names, but I could.
00:00:57.000 They don't publish their source code.
00:00:59.000 So you can't look at the algorithms to see what's happening.
00:01:02.000 You can't see if there's spyware in there, if they have Google Analytics, little nasty stuff.
00:01:07.000 So you're talking about Gettr.
00:01:08.000 Gettr.
00:01:09.000 Yeah.
00:01:10.000 Because I've found out that...
00:01:12.000 Parler, Rumble...
00:01:14.000 All of them?
00:01:15.000 I'm not trying to trash these people.
00:01:17.000 I think that the free speech stuff is good.
00:01:19.000 But some of their terms aren't even free speech.
00:01:22.000 So, you know...
00:01:25.000 Free speech policy is essential.
00:01:27.000 So I absolutely respect any network that is putting forward a free speech policy.
00:01:32.000 But you can't have a free speech policy with sketchy algorithms and closed source code, because then we don't know if you're soft censoring or shadow banning.
00:01:44.000 We don't know what's happening in the news feed behind the scenes.
00:01:47.000 Right.
00:01:47.000 Which we definitely know Facebook does, Instagram does, Twitter does.
00:01:51.000 That's all real.
00:01:52.000 Right.
00:01:52.000 So then you've got, are they privacy-focused, end-to-end encrypted?
00:01:56.000 Do they have access to the content of your messages?
00:02:00.000 So we use an end-to-end encrypted messenger protocol called Matrix, so that we don't even have access to people's conversations.
00:02:07.000 Like, I don't want access.
00:02:09.000 Right.
00:02:10.000 And then you've also got, you know, do they pay creators fairly?
00:02:15.000 So you've got these check marks that you go through with each, but open source is key.
00:02:19.000 The future, there is nothing without open source.
00:02:22.000 Any app, if they're claiming to be an alternative and they're not open source, they're not in the same conversation.
00:02:28.000 It's a completely different animal and they should not be taken seriously because they're not being transparent with the world.
00:02:36.000 And then you get into decentralization and actually building an app that...
00:02:40.000 So Google says, don't be evil.
00:02:42.000 But it really can't be evil.
00:02:45.000 We want to make it impossible for us to even take down our network at all.
00:02:50.000 And that's why immutable distributed systems like blockchains and Tor and IPFS, all of these different decentralized systems, are emerging.
00:03:03.000 And we're interacting with them.
00:03:05.000 We're not fully decentralized yet.
00:03:06.000 But there's like a progression that...
00:03:10.000 A lot of apps in the Web3 slash decentralized web space are moving towards.
00:03:16.000 Okay.
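For context on the Matrix protocol Ottman mentions: it's an open federation standard, and any client can talk to any homeserver. Below is a minimal sketch of sending a message with the open-source matrix-nio Python client; the homeserver URL, user, password, and room ID are placeholders, and this is not Minds' actual integration.

```python
import asyncio
from nio import AsyncClient  # pip install matrix-nio

async def main() -> None:
    # Placeholder homeserver and account; any Matrix homeserver works the same way.
    client = AsyncClient("https://matrix.example.org", "@alice:example.org")
    await client.login("correct-horse-battery-staple")
    # In a room with encryption enabled (and matrix-nio's e2e extras installed),
    # the event body is encrypted client-side before the server ever sees it.
    await client.room_send(
        room_id="!someroom:example.org",
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "hello over Matrix"},
    )
    await client.close()

asyncio.run(main())
```

The design point is the one Ottman makes: with end-to-end encryption, the operator stores only ciphertext, so "not having access" is a property of the protocol rather than a policy promise.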
00:03:17.000 And so, Daryl, to fill people in on you, you've been on the podcast before and you have an incredible history.
00:03:22.000 You're a brilliant musician and you have personally converted.
00:03:27.000 What is the number now?
00:03:28.000 It's more than 200 Ku Klux Klan members and neo-Nazis.
00:03:33.000 I mean, we talked about these guys giving you their Klan outfits and retiring because they met you.
00:03:41.000 And just because you had reasonable conversations with them made them realize how stupid these ideologies that they had somehow or another been captivated by.
00:03:51.000 I mean, at the end of the day, you know, a missed opportunity for dialogue is a missed opportunity for conflict resolution.
00:03:57.000 It's as simple as that.
00:03:58.000 But it's not just having a dialogue or a conversation or a debate.
00:04:02.000 It's the way that we have it, how we communicate, you know, that makes it effective.
00:04:07.000 For example, I've been to 61 countries on six continents.
00:04:11.000 I've played in all 50 states.
00:04:13.000 So all that is to say that I've been exposed to a multitude of skin colors, ethnicities, religions, cultures, ideologies, etc.
00:04:20.000 And all of that has shaped who I've become.
00:04:22.000 Now, all that travel does not make me a better human being than somebody else.
00:04:26.000 It just gives me a better perspective of mass humanity.
00:04:32.000 And what I've learned is that no matter how far I've gone from our own country, right next door to Canada or Mexico or halfway around the globe, no matter how different the people I encounter may be, they don't look like me, they don't speak my language, they don't worship as I do or whatever, I always conclude at the end of the day that we all are human beings.
00:04:50.000 And as such, we all want these same five core values in our lives.
00:04:55.000 Everybody wants to be loved.
00:04:56.000 Everybody wants to be respected.
00:04:59.000 Everybody wants to be heard.
00:05:01.000 We all want to be treated fairly, and we all basically want the same things for our family as anybody else wants for their family.
00:05:07.000 And if we learn to apply those five core values when we find ourselves in an adversarial situation or a culture or society in which we're unfamiliar, I can guarantee that the navigation will be a lot smoother.
00:05:21.000 And essentially, that's what's happening here at Minds.
00:05:23.000 We're allowing people to be heard.
00:05:25.000 We're showing them that kind of respect.
00:05:26.000 We don't have to respect what they're saying, but respect their right to say it.
00:05:31.000 And we provide that platform because, you know, when you don't do that, you're driving people to a platform that will embrace them.
00:05:39.000 And then it becomes an echo chamber.
00:05:41.000 And essentially, it could become a breeding ground for a cesspool of nefarious activities, whether it's extremism or violence or conspiracy theories or what have you.
00:05:50.000 So it seems like there's an issue with many social media companies where they want to censor bad ideas.
00:05:58.000 It seems to me that part of that is because the work involved in taking a person who's a neo-Nazi or Ku Klux Klan member and showing them the error of their ways, allowing them to spread their nonsense, and then slowly but surely introducing them to better ideas,
00:06:17.000 it's exhausting.
00:06:19.000 They're not willing to do the work.
00:06:21.000 Exactly.
00:06:22.000 So what Twitter does is like, fuck you, get out of here.
00:06:24.000 What Instagram does, the same thing with all these people.
00:06:26.000 But the problem with that is then it goes further and further and further and further down where you're getting rid of people for just not agreeing with you.
00:06:34.000 So this is empirical now.
00:06:36.000 So Daryl and I just wrote this paper called The Censorship Effect, along with Jesse Morton, Justin Lane, LeRon Shults, and my brother Jack, with multiple PhDs involved. Serious research has gone into this.
00:06:52.000 Even outlets on the left, like Vox, are now admitting that deplatforming... This is being admitted across the board.
00:07:22.000 I mean, because these are very smart people that work at big tech sites.
00:07:25.000 They know about data science.
00:07:27.000 They know the spread of information.
00:07:29.000 I don't think they're intentionally causing it.
00:07:31.000 I think, first of all, there's an ideology that is attached to all the big tech companies, whether it's Google or Facebook or Twitter.
00:07:40.000 You have to be what they think is woke, right?
00:07:43.000 You have to subscribe to a certain line of thinking And anybody that deviates from that line of thinking should be suppressed, minimized, or banned.
00:07:54.000 So how is that not intentional?
00:07:55.000 But it's not intentional, meaning they're not trying to radicalize people.
00:07:59.000 That's not what they're trying to do.
00:08:01.000 They're just foolish in their approach.
00:08:04.000 I think some of their data science researchers do know.
00:08:08.000 Yeah, but they're not getting to the people that are the CEOs.
00:08:10.000 No, they're not.
00:08:11.000 The CEOs have to virtue signal.
00:08:12.000 All the people that are executives have to virtue signal.
00:08:15.000 And they have to say, we're doing our best to stop harmful talk.
00:08:18.000 But what they call harmful, a lot of it is disagreeing with pharmaceutical companies, which is just fucking crazy.
00:08:26.000 These are the lyingest liars that ever lied.
00:08:29.000 Did you see Zuck on Lex's show?
00:08:32.000 Yes, I did.
00:08:33.000 What did you think?
00:08:35.000 You know, it's hard.
00:08:36.000 It's hard because that guy has an enormous responsibility.
00:08:41.000 He's the head of this insanely huge platform that covers the entire planet Earth.
00:08:46.000 And everything he says has to be measured.
00:08:48.000 It's like you ever see him drink water?
00:08:50.000 He drinks water like this?
00:08:51.000 Like a weird way of drinking water.
00:08:53.000 He doesn't fucking drink the water.
00:08:54.000 He like sips it, it touches his lips, and then he's done.
00:08:57.000 He's like, everything is like measured, measured.
00:09:00.000 Like, I can't imagine trying to speak freely when you're the CEO of Facebook.
00:09:05.000 I think it's almost like pointless to talk to him in that sort of circumstance.
00:09:09.000 Well, you know, to your point about, you know, people doing this and defending it and so forth and so on.
00:09:15.000 I mean, I think the quote by Upton Sinclair comes into play.
00:09:18.000 I think he said something to the effect of, it's difficult for a man to understand something when his salary depends upon him not understanding it.
00:09:27.000 Yes.
00:09:27.000 Yes.
00:09:28.000 Yeah.
00:09:28.000 And if you live in that world, if you live in that tech world, and I have many friends who have, you know, they're executives at these places.
00:09:35.000 That is just the fucking doctrine.
00:09:38.000 You have so many employees that they have these radical ideas about what you're supposed to do and not supposed to do, and what you're supposed to platform and not platform, and this idea of platforming people.
00:09:51.000 I have people on this podcast all the time that I don't agree with at all, or I agree with them very little, and I want to see what's going on in their head, and I'll get that.
00:10:01.000 You're platforming these people.
00:10:03.000 You're platforming a bad person.
00:10:05.000 I don't think they're bad people.
00:10:06.000 I just don't agree with them.
00:10:07.000 And they have a right-wing ideology that I don't think should be suppressed.
00:10:13.000 I think you should try to pick it apart.
00:10:15.000 You cannot change someone's mind if you do not platform them.
00:10:18.000 It is impossible for someone with horrible ideology to change.
00:10:24.000 I should say, not just a right-wing idea.
00:10:27.000 There's a lot of people with left-wing ideologies that I think are ridiculous, and I want to pick those apart, too.
00:10:32.000 I want to have conversations with people, and this idea that you're only supposed to have conversations with people that you absolutely agree with, and that what you're doing is just broadcasting these ideas to better humanity.
00:10:43.000 If you want a better humanity, have fucking conversations with people.
00:10:46.000 Look, you know, this goes all the way back, I mean, centuries, even back to BC, as in before Christ, right?
00:10:53.000 I mean, we can go back as far as, let's just say, Copernicus, the astronomer who passed away in 1543, okay?
00:11:03.000 Up until then, the belief was that we are a geocentric model universe.
00:11:09.000 Disinfo.
00:11:10.000 Huh?
00:11:10.000 Sorry.
00:11:11.000 No, I was saying that it would have been called disinfo.
00:11:13.000 Okay, yeah, exactly.
00:11:15.000 So even the Catholic Church endorsed that we are a geocentric model universe, meaning that the earth is the center of the universe and everything revolves around us, right?
00:11:25.000 Right.
00:11:25.000 And Copernicus said, no, we're just another planet.
00:11:28.000 The sun is the center of the universe and everything revolves around the sun, which makes it a heliocentric model.
00:11:36.000 And everybody scorned him, ridiculed him.
00:11:39.000 A hundred years later...
00:11:42.000 Galileo came along and built upon Copernicus' theory and developed it even further and said, yes, we are a heliocentric model.
00:11:52.000 And he got arrested for heresy against the Catholic Church.
00:11:58.000 But guess what?
00:12:00.000 He was right.
00:12:01.000 He was right.
00:12:02.000 So, you know, sometimes we have to stand up to the masses, not just join in because everybody else thinks this way.
00:12:09.000 And it's also the problem of the walled garden, right?
00:12:12.000 There's a lot of people that get booted from these social media platforms, whether it's Twitter or Facebook, and then they look at that and they look at those people with further and further disdain and it separates them.
00:12:24.000 From whoever's there.
00:12:26.000 And we're not even just talking about radical people.
00:12:24.000 One of the things that really alerted me to how crazy the censorship shit was, was Bret Weinstein had a group that he put together called Unity 2020. And the idea was to bring people that were from the left...
00:12:42.000 That were really reasonable, and from the right that were really reasonable, that weren't captured by corporate greed, and to have them as an alternative candidate.
00:12:50.000 Like, instead of saying, like, you have to be a Republican, or you have to be a Democrat, let's, like, get reasonable left-wing and right-wing people that can agree on a lot of stuff and have them work together, and maybe you have a candidate that's, like, a vice president and a president, one's right-wing, yeah, like, it would be a great way to sort of,
00:13:06.000 like, come together in the middle.
00:13:08.000 Twitter banned the account.
00:13:10.000 Twitter banned an alternative account.
00:13:13.000 There was nothing unreasonable about what they were saying.
00:13:17.000 It was all just conversations with people that are brilliant that happen to be left-wing and brilliant that happen to be right-wing.
00:13:24.000 Let's get them together and see if we could lead this country in a better direction than having this polarization of right versus left where people get super tribal about it.
00:13:32.000 Like, this would be a great way to meet in the middle.
00:13:34.000 And Twitter was like, fuck you.
00:13:35.000 And they banned the account.
00:13:37.000 They had such good intentions.
00:13:38.000 We still do.
00:13:39.000 But the idea that you can get banned for trying to come up with another political party, are you saying that this system is infallible?
00:13:48.000 This right versus left system of blue and red is infallible?
00:13:51.000 That's so crazy!
00:13:53.000 We are here because someone didn't like what was going on in Europe in the 1700s and they took a chance on starting a new system.
00:14:02.000 A system of self-government that was a complete experiment.
00:14:05.000 And it had never been done before in the world.
00:14:07.000 And that created the United States.
00:14:09.000 And the idea that you, the fucking tech dorks, are going to step in and say, no, this is dangerous thinking.
00:14:17.000 Yeah.
00:14:17.000 Oh, the battle-tested First Amendment.
00:14:20.000 Hundreds of years of precedent, legal precedent.
00:14:23.000 Talk about a good content policy.
00:14:26.000 The First Amendment.
00:14:27.000 But it doesn't apply.
00:14:29.000 They say it doesn't apply because this is a private company.
00:14:30.000 They think that their lawyers are better at drafting healthy conversation than the First Amendment.
00:14:35.000 And that's just not true.
00:14:37.000 I think there was a real concern in the early days of Twitter and of social media, where a lot of these outrageous right-wing people were starting to get a lot of attention. Like, Milo Yiannopoulos was a big one,
00:14:54.000 Gavin McInnes, and a lot of these guys, they were getting a lot of attention, and the response from the left was like, no, no, no, no, silence them!
00:15:03.000 Like, I heard this one woman talking about her kid is listening to Ben Shapiro, and I would love to get Ben Shapiro removed from all platforms.
00:15:10.000 Oh, Kara Swisher.
00:15:12.000 I think that's who, she's a Vox reporter.
00:15:14.000 And Vox is interesting because they're like smart people, but they're also, you know, they sort of embody this...
00:15:20.000 Their recent article, I don't know, Jamie, if you could find it, it's like, does de-platforming work out of Vox?
00:15:26.000 And they...
00:15:27.000 They don't need to find that.
00:15:28.000 Well, I'm just saying...
00:15:29.000 No, no, no, no, but the reason I was so happy was because they're referencing similar studies that we reference in our paper.
00:15:36.000 They're starting to be forced to acknowledge that the censorship is having serious negative consequences.
00:15:43.000 It polarizes this country.
00:15:45.000 And so what you were saying before about, you know, people, their beliefs being reinforced after they get banned.
00:15:51.000 You know, they're victims.
00:15:52.000 Now they believe the thing that they were ranting about.
00:15:55.000 So in the literature, it's called certainty, level of certainty.
00:15:59.000 That's what's measured.
00:16:01.000 And it's clearly shown that certainty accelerates with deplatforming based on whatever you were thinking before.
00:16:10.000 So isolation and certainty have an overlap.
00:16:14.000 Yeah, so if you have an idea, like especially with something as innocuous as Unity 2020 or beneficial, the idea of unity, I mean, come on, it's like literally in the title.
00:16:23.000 That's what we're all hoping for.
00:16:25.000 We're united as a community, the United States of America, all these different ideas.
00:16:29.000 Let's work together.
00:16:30.000 No.
00:16:31.000 Fuck you.
00:16:32.000 You're not a right wing.
00:16:33.000 You're not a left wing.
00:16:33.000 You can't be a part of the problem because you're going to draw votes away from the people that we think it's imperative that they win.
00:16:40.000 So it changes the whole idea of what democracy is because they're kind of admitting that they have an influence on the way our elections go.
00:16:49.000 You know, I mean, and speaking of unity, you got those people who are out protesting every day, you know, to help change and bring people together, but a lot of them are the very ones who will not sit down and talk with the person that they're protesting against, you know?
00:17:05.000 So how badly do they really want unity?
00:17:08.000 Well, this happened to us, personally.
00:17:10.000 Well, it's happened in universities.
00:17:11.000 That's where it happened first.
00:17:13.000 I started seeing it, I mean, I guess it was a couple of decades ago.
00:17:17.000 You started to see when someone was a controversial speaker, they would come to a university, and instead of someone debating that person or someone, you know, listening to that person's ideas and picking them apart, instead, they were like pulling fire alarms and shouting people down and screaming at the top of their lungs in the middle of the auditorium.
00:17:36.000 They're silencing people's ideas because they feel that their ideas are better, which is exactly the opposite of what the Founding Fathers were trying to sort of manage when they came up with the First Amendment.
00:17:51.000 We're really trying to make this less of an emotional debate because I think the censorship and speech stuff is obviously very emotional.
00:17:58.000 We're talking about hate speech.
00:17:59.000 We're talking about a lot of horrible stuff that hurts people personally.
00:18:02.000 And so the big tech strategy is, oh, we care about people's feelings and we want to...
00:18:08.000 Hide this information because it's offensive, but we need to remove the emotion and look at this empirically in terms of what is actually making society more healthy and what is actually preventing radicalization and violent extremism.
00:18:23.000 There's a difference between radicalization and violent extremism.
00:18:26.000 So, if we can prove to big tech that deplatform—we want them to adopt a free speech policy.
00:18:32.000 I think that's the goal here.
00:18:33.000 We don't expect that Facebook and Google are going away.
00:18:35.000 It's not going to happen.
00:18:36.000 There's going to be no MySpace of Facebook and Google.
00:18:39.000 They are embedded in the infrastructure of the planet.
00:18:43.000 [inaudible]
00:19:01.000 Let me ask you this, like for Minds, like say if someone starts like a neo-Nazi group and they start posting on Minds and they start talking about the master race and eliminating Jews and crazy Nazi type shit, what do you do?
00:19:17.000 Oh, I mean, as long as it's not calling for violence or having true threats of violence, then it will go under an NSFW filter.
00:19:26.000 So it will go under sort of a, you know, it'll have a sensitive kind of click through.
00:19:31.000 So you'll be warned.
00:19:32.000 Before you're seeing that.
00:19:34.000 So it's like one of those Instagram videos, like if you see a car accident or something like that, there's Instagram videos where you have to say that you're willing to see this offensive thing.
00:19:43.000 Exactly.
00:19:43.000 So we have tags so that people...
00:19:46.000 We don't want anyone to see stuff they don't want to see.
00:19:48.000 So what if someone doesn't use the tags?
00:19:50.000 Then it'll get reported and get tagged.
00:19:53.000 Okay.
00:19:53.000 So it's like if someone starts posting Nazi propaganda, they just immediately...
00:19:58.000 Like someone reports it?
00:20:00.000 Yep.
00:20:00.000 And we also have a jury system.
00:20:02.000 So we're rolling out this system where the community can sort of help create consensus around tags on different content.
00:20:08.000 And, you know, if we make a mistake, it can get appealed.
00:20:11.000 And the community actually votes, not us.
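A toy sketch of the report / tag / jury-appeal flow described above. The data model, function names, and simple-majority threshold are invented for illustration; this is not Minds' actual code.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    tags: set = field(default_factory=set)

def report(post: Post, tag: str) -> None:
    # Reported content isn't removed; it gets a sensitivity tag and sits
    # behind a click-through warning, per the NSFW filter described above.
    post.tags.add(tag)

def jury_appeal(post: Post, tag: str, votes: list) -> None:
    # Community jurors vote True to keep the tag, False to lift it.
    # A simple majority is assumed here purely for illustration.
    if sum(votes) <= len(votes) / 2:
        post.tags.discard(tag)

post = Post("someuser", "borderline content")
report(post, "nsfw")                             # flagged by another user
jury_appeal(post, "nsfw", [False, False, True])  # 2 of 3 jurors vote to lift it
print(post.tags)                                 # set() -- tag removed on appeal
```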
00:20:14.000 And Daryl, your take on this is like, how do you think that a social media company like Twitter, something that's really huge, can pivot from the model that they have now, where they just ban people?
00:20:29.000 Because, you know, that points to them assuming that the majority of people out here are stupid and that these companies need to tell you what to believe, which to me is offensive.
00:20:42.000 It is offensive, yeah.
00:20:44.000 I believe, you know, yes, there's a lot of bad information out there.
00:20:48.000 And, you know, the more liberal you make your platform, allowing anybody and everybody to come in, yeah, you're going to have some bad actors, sure.
00:20:54.000 But the way you address it is you combat bad information by providing more good information.
00:21:01.000 Yeah, well, that's the age-old idea.
00:21:04.000 So Clarence Thomas, Supreme Court Justice, came out and he said that he thinks Networks above a certain size should be considered common carriers.
00:21:15.000 There's this whole debate about Section 230 and whether networks have a right to take things down.
00:21:20.000 It's pretty definitive that big social networks, private companies do have the right to moderate.
00:21:24.000 That's a fact.
00:21:25.000 Section 230 doesn't say you have to keep up everything.
00:21:28.000 But the common carrier, like a phone company, can't ban you for your views.
00:21:33.000 So they're common carriers, and that's an important distinction.
00:21:36.000 I think that's a rational suggestion from Thomas, that once you reach a certain size, you cannot just be going and playing favorites.
00:21:44.000 Yeah.
00:21:45.000 Yeah, I know Jack Dorsey had an idea of two versions of Twitter, a curated, moderated version of Twitter and then a Wild West version.
00:21:53.000 And he was trying to pitch that and I think they shot him down.
00:21:57.000 But his idea was like we should have some Twitter that's like got some sort of decentralized control where it's not up to the moderators to decide what's on it and people can just put on whatever they want.
00:22:11.000 Yeah, he launched a project called Bluesky, which is sort of a research initiative into decentralized social media, kind of very much in our space.
00:22:19.000 Before or after he left?
00:22:20.000 Before he left.
00:22:21.000 And then he left, like, two days after he left, there was this huge censorship issue where they said, oh, you can, if it's a private image, it can get taken down on Twitter.
00:22:30.000 So, like, any private image of anybody.
00:22:32.000 Oh, after they left, after he left, they ramped up censorship in a big way.
00:22:36.000 Yeah.
00:22:36.000 And it seems like, I mean, it's a hard position to be in because, you know, it's like your baby.
00:22:41.000 This is a company he's been working on forever and he doesn't want to badmouth it.
00:22:44.000 But I would not be at all surprised if there were some internal wars happening about, I mean, there's a huge wired piece about internal free speech wars in Twitter management.
00:22:54.000 So it's a fact that it's not, you know, one single ideology in these companies.
00:23:00.000 There's definitely an overwhelming ideology, but I think that there is starting to be pushback.
00:23:05.000 So that's positive.
00:23:06.000 Yeah, there's some intelligent people that realize the error of their ways and that this whole thing is going in a negative direction.
00:23:11.000 And Daryl, how did you get involved with Bill and Minds, and, like, what was your idea going into this?
00:23:18.000 Well, Bill had contacted me after seeing me on some interview or reading about me or something to participate in an event he was originally going to have in New Jersey, then it got moved to Philadelphia.
00:23:29.000 How long ago was this?
00:23:30.000 Oh, what, five, six years ago?
00:23:31.000 I think, no, like 2019. Before the pandemic?
00:23:36.000 Before pandemic, yeah.
00:23:37.000 Yeah, pre-pandemic.
00:23:38.000 So 2019. And I liked what he was talking about.
00:23:42.000 All different people from different political backgrounds, you know, stations in life, whatever, coming together.
00:23:50.000 And so I said, yeah, count me in.
00:23:52.000 And I went and did it.
00:23:54.000 And he had everybody there from all different walks of life.
00:23:57.000 We all got along.
00:23:58.000 We had different views.
00:23:59.000 We talked together.
00:24:00.000 We listened to each other's presentations.
00:24:03.000 And then we had an after party together where everybody just kind of let their hair down, all that kind of stuff.
00:24:07.000 The only people who were not supportive were the protesters across the street, some of whom called me a white supremacist.
00:24:17.000 Yeah, I think Melissa Chen talked about this on your show a while back.
00:24:20.000 But basically, Antifa was like protesting the event.
00:24:23.000 You know, we had all these big YouTubers, Tim and, you know, people on the left, right?
00:24:29.000 Andy was there, right?
00:24:30.000 Andy was there, yeah.
00:24:31.000 And there were some progressives.
00:24:33.000 Tim and Andy.
00:24:34.000 Say their last names.
00:24:35.000 Sorry, Andy, no.
00:24:36.000 Tim Pool.
00:24:36.000 But we also had some...
00:24:39.000 There were some leftists there as well.
00:24:42.000 And...
00:24:43.000 We really did our best to make it as balanced as possible.
00:24:47.000 And communists and capitalists and- And the protesters were like, you shouldn't be communicating with each other.
00:24:53.000 But the Antifa protest, I mean, it's like, are they even real people?
00:24:57.000 It's like, I guess they are.
00:24:58.000 I'm joking.
00:24:59.000 But it's like- What are we doing when you're allowing these people to dictate?
00:25:05.000 They're so crazy.
00:25:07.000 And we're allowing them to dictate what is and isn't said based on the threats of violence and lighting buildings on fire and shit?
00:25:15.000 They got us deplatformed from the original theater that we were going to have it in.
00:25:19.000 So we had to move to Philly.
00:25:20.000 How is that possible?
00:25:21.000 We sold out.
00:25:22.000 Theater?
00:25:22.000 Yeah.
00:25:23.000 Fear, yeah.
00:25:24.000 Fear of repercussions.
00:25:26.000 I do need to say one thing.
00:25:28.000 So I mentioned Jesse Morton, who is one of the co-authors of the paper.
00:25:32.000 So he actually, a good friend of Daryl's and mine, recently passed away.
00:25:36.000 He's a former extremist.
00:25:38.000 And he is leading in the de-radicalization space.
00:25:43.000 He actually was an Al-Qaeda propaganda lead.
00:25:47.000 So he ran a propaganda site for Al-Qaeda.
00:25:49.000 He went to Columbia.
00:25:50.000 He was doing this in New York City.
00:25:51.000 So he's from the U.S. And, you know, Daryl, maybe you can expand.
00:25:56.000 Oh, I know who that gentleman is.
00:25:57.000 You do?
00:25:58.000 Okay.
00:25:58.000 I was looking to have him on my podcast, and he passed away.
00:26:02.000 Yeah, this was one of his last big projects.
00:26:05.000 Daryl's actually out in Portland right now.
00:26:08.000 I'm sure you guys know about Maajid, who's one of the best examples of that.
00:26:13.000 Someone who was radicalized.
00:26:15.000 Even the name of his podcast is Radical.
00:26:18.000 His book's Radical.
00:26:20.000 He was a guy who was deeply embedded in this sort of Islamic group.
00:26:25.000 And then went to jail and realized why he was in jail, started reading, sort of examining this thought process, and came out of it this sort of brilliant mind to analyze what's the process where people get radicalized?
00:26:43.000 How does this happen?
00:26:44.000 And he could say it from—first of all, he's incredibly articulate, so he could say it from this way.
00:26:49.000 He's coming from this place of, I was this guy.
00:26:52.000 Instead of, I know what's wrong with these people, I was these people.
00:26:55.000 I am evidence.
00:26:56.000 Yes, exactly.
00:26:58.000 And that's what, you know, and that's what Minds has in terms of doing the research.
00:27:01.000 You know, we've done like a polymath, a 360, digging from all different genres, psychologists, former extremists, trolls, all kinds of people, people like myself with boots on the ground dealing with current extremists and things like that.
00:27:17.000 So all of that comes, you know, into the conclusion of this paper.
00:27:21.000 Unlike a lot of other papers, they talk about why people do this.
00:27:26.000 Others talk about the effect of what they've done.
00:27:29.000 Some talk about the cause and the effect.
00:27:31.000 But we have the cause, the effect, and the solution.
00:27:35.000 It's just hard to get people to jump onto a new social media network.
00:27:39.000 That seems to be a real issue.
00:27:41.000 Human beings are creatures of habit, not change.
00:27:44.000 Yes.
00:27:44.000 And you get, if you're used to checking Facebook, oh, let's see what grandma posted.
00:27:48.000 You're used to doing that, and this is your go-to thing, and you only have so much time in the world.
00:27:53.000 It's hard to get someone to deviate from that, right?
00:27:55.000 There's no rush.
00:27:56.000 We're seeing huge growth just naturally.
00:27:59.000 How many people do you have?
00:28:00.000 More than five million.
00:28:01.000 Five million?
00:28:02.000 Yeah.
00:28:02.000 Wow.
00:28:03.000 And Maajid's on there, actually.
00:28:05.000 Minds.com slash Maajid.
00:28:06.000 Of course he is.
00:28:07.000 Yeah, he just signed up.
00:28:08.000 And so, you know, it's all just long-term, like, thinking, where are we actually headed?
00:28:14.000 Where are we going to be in 10, 20 years?
00:28:17.000 Like...
00:28:18.000 You don't...
00:28:19.000 And also, it makes it harder to grow for what you said.
00:28:23.000 People are just stuck in their ways.
00:28:25.000 But also, Facebook and Google use the dirtiest tricks in the book to grow.
00:28:29.000 I mean, they literally latched their tentacles into everybody's phones, grabbed all their contacts, like, you know, followed you in your browser.
00:28:37.000 Like, every surveillance tactic they could get to grow.
00:28:40.000 Explain that.
00:28:41.000 What do you mean?
00:28:42.000 So there are these sort of dark growth hacking tricks that a lot of apps will use to increase their user base.
00:28:51.000 And it's basically like manipulative growth techniques to get people to give them more information than you otherwise would.
00:28:59.000 So let's just say I'm a person who's never used Facebook before and I just got a new phone and I said, you know what, I'm going to download Facebook.
00:29:07.000 What happens?
00:29:08.000 Oh, you know, they take you through their nice onboarding flow, super slick UX because they have brilliant designers.
00:29:14.000 What's UX? User experience.
00:29:16.000 Okay.
00:29:16.000 So, you know, you just keep pressing that big blue button.
00:29:20.000 Yep, yep, yep, yep.
00:29:22.000 Oh yeah, agree to terms, yep.
00:29:24.000 And so they just put it, you know, they make it very subtle what you're doing.
00:29:32.000 And there are benefits.
00:29:33.000 But what is happening?
00:29:34.000 They're grabbing all of your contact book.
00:29:37.000 They're grabbing your location.
00:29:38.000 So they grab all your phone numbers?
00:29:40.000 All your phone numbers.
00:29:41.000 So when you sign up for Facebook, it has access to all of the phone?
00:29:45.000 Yes.
00:29:45.000 If you give it to them.
00:29:46.000 If you give it to them.
00:29:47.000 You can say no.
00:29:48.000 Twitter is doing this.
00:29:50.000 They won't stop it.
00:29:52.000 You say no and then it shows up in your feed another prompt to do it.
00:29:56.000 So are they getting the full contact with the names and everything?
00:30:00.000 Or are they just getting the phone numbers?
00:30:02.000 It depends on the specific app and they all kind of have different levels of invasiveness.
00:30:08.000 And how many people read the entire policy agreement?
00:30:11.000 How about zero, right?
00:30:12.000 Exactly.
00:30:13.000 Who the fuck's reading that?
00:30:13.000 No.
00:30:14.000 So like if I have you on my phone and I sign up for Facebook, does it get Bill Ottman plus your phone number and then they could target you?
00:30:23.000 Yeah.
00:30:23.000 So they could just send you a text message or...
00:30:25.000 They could.
00:30:26.000 Or they can sync up to your Facebook that you have.
00:30:30.000 Yeah.
00:30:30.000 Because your app, they're aware that we're communicating with each other because we both have each other's phone number.
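A sketch of how contact-book matching can work in principle: numbers are normalized, hashed, and joined against the service's user table, which is enough to infer who knows whom before the new user posts anything. All names, numbers, and functions here are made up for illustration.

```python
import hashlib

def normalize(number: str) -> str:
    # Strip formatting so "+1 (512) 555-0100" and "1-512-555-0100" compare equal.
    return "".join(ch for ch in number if ch.isdigit())

def number_hash(number: str) -> str:
    # Services often upload hashes rather than raw numbers, but a hash of a
    # short digit string is trivially brute-forceable, so it's weak cover.
    return hashlib.sha256(normalize(number).encode()).hexdigest()

# Hypothetical server side: registered users keyed by number hash.
users = {number_hash("+1 (512) 555-0100"): "bill"}

# Hypothetical client side: the app uploads the new user's whole contact book.
uploaded = [number_hash("1-512-555-0100"), number_hash("1 (917) 555-0123")]

matches = [users[h] for h in uploaded if h in users]
print(matches)  # ['bill'] -- the social graph edge exists before bill posts anything
```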
00:30:36.000 Well, it's like, you know, you go on Facebook and there's some sponsored ad there.
00:30:39.000 And it has a list of your friends that like this ad.
00:30:43.000 Right.
00:30:44.000 He's like, I didn't know she liked that.
00:30:46.000 Well, how do they know she liked it?
00:30:48.000 And how do they know that you might like it because so-and-so liked it?
00:30:52.000 Yes.
00:30:53.000 Let me ask you this because this is a big one that everybody always wants to know.
00:30:56.000 Sometimes we're talking about stuff and then I'll hear an ad.
00:31:00.000 See an ad for the thing we're talking about.
00:31:02.000 Like Jamie and I have talked about this many times where it's like there is no way this random ad would have popped up just on its own.
00:31:11.000 It seems like they have to be listening.
00:31:13.000 So, I wish Lex had asked Zuck this question, because this is a key question that, honestly, I'm not going to claim to know.
00:31:21.000 I mean, we don't have access to their source code, so we do not know.
00:31:25.000 And they've denied it repeatedly.
00:31:28.000 I think that the Geo can trick you a lot of time into thinking that they're listening.
00:31:34.000 And, you know, different associations they're able to make in the back end.
00:31:38.000 But I don't know the answer, but I know thousands of stories like what you're saying.
00:31:44.000 And I feel like they're just skirting around it.
00:31:47.000 And I would not at all be surprised.
00:31:49.000 But again, if we can't see the source code, we do not know.
00:31:52.000 So no one has definitively proven that they're actually listening to you?
00:31:57.000 Well, they are listening because you can say, okay, Google.
00:32:00.000 How does it know that you said, okay Google?
00:32:03.000 Right, or hey Siri.
00:32:04.000 Yeah, so it's definitely listening for certain cues and we don't know the breadth of that.
00:32:11.000 So it could be that it picks up certain words that would indicate products that maybe you'd be interested in buying and they show you those ads.
00:32:17.000 I was looking in my Google data history like a couple weeks ago. And I had, you know, I have a bunch of different phones, actually, which I want to show you later, some, like, new open source stuff.
00:32:30.000 Show me now.
00:32:31.000 Okay, just in a sec.
00:32:32.000 So, in my Google data history, it showed when I said certain words, like, that triggered it.
00:32:39.000 Like, there was all these different words that are sort of commands for assisting.
00:32:45.000 Google Assistant, I think it's called.
00:32:46.000 And I had turned Google Assistant off, and yet it still had, it was like on June, or on June 18th, you said, you know, hello, or whatever it was.
00:32:55.000 And it was just this whole history, and I just deleted it all and, like, turned it all off.
00:32:58.000 And it's, um, they're definitely listening for cues. So even if you say no, you opt out.
00:33:03.000 Yeah, I turned it off.
00:33:04.000 Yeah.
00:33:04.000 Wow.
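A toy illustration of the distinction in play here: a voice assistant has to match trigger phrases locally, and the open question raised above is whether anything beyond the wake phrase is also matched, for example against advertising keywords. The phrase and cue lists below are invented.

```python
import re

WAKE_PHRASES = ("okay google", "hey siri")    # hypothetical trigger list
AD_CUES = {"mattress", "vacation", "pickup"}  # hypothetical ad keywords

def scan(transcript: str) -> dict:
    text = transcript.lower()
    words = set(re.findall(r"[a-z']+", text))
    return {
        "woke": any(phrase in text for phrase in WAKE_PHRASES),
        # This is the part nobody outside can verify without the source code.
        "ad_cues_heard": sorted(AD_CUES & words),
    }

print(scan("We should buy a new mattress. Okay Google, set a timer."))
# {'woke': True, 'ad_cues_heard': ['mattress']}
```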
00:33:05.000 Now, as he points out, you know, a lot of them are closed source code.
00:33:10.000 Minds is open source code.
00:33:12.000 You can get on, you can see it, and take our code and use it.
00:33:15.000 And we want you to.
00:33:16.000 It's available.
00:33:18.000 Yeah, and Minds...
00:33:19.000 Transparency.
00:33:20.000 The way Minds...
00:33:21.000 I've never actually used Minds.
00:33:23.000 I know I have an account over there, right?
00:33:25.000 Yeah, you got it.
00:33:26.000 We'll get you set up again.
00:33:27.000 We'll make it easy so you don't have to do anything.
00:33:29.000 We'll listen in.
00:33:31.000 It's all about making it easy because everyone's so busy.
00:33:34.000 But I don't go on any social media anymore.
00:33:37.000 I post and then I get the fuck out of there.
00:33:40.000 You're an Instagram boy.
00:33:41.000 Yeah.
00:33:42.000 Well, I use Twitter and Facebook too, but I don't use them very much.
00:33:46.000 I use Twitter to see how crazy people are.
00:33:47.000 Like, how crazy are people today?
00:33:49.000 Let me just look through my feed and see who's fucking screaming at everybody.
00:33:52.000 What's going on in Russia?
00:33:53.000 Who believes this is all a conspiracy?
00:33:56.000 Where's the tinfoil hat brigade on this one?
00:33:59.000 Just to get a finger on the pulse.
00:34:01.000 And then with Instagram, I just post.
00:34:04.000 And then on Facebook, Facebook is nonsense for me.
00:34:08.000 I go there for just nonsense.
00:34:10.000 I go there for videos and stuff, and I'm not even remotely looking to engage with people.
00:34:16.000 I feel like the engagement with people, for a person with a profile as big as mine, it's too much work.
00:34:23.000 The interaction with people, it's too toxic.
00:34:27.000 So many people are mad for whatever fucking reason, and it's a lot of bad faith conversations.
00:34:33.000 For my own mental health, I opt out.
00:34:35.000 I think of it as extra distribution outlets.
00:34:37.000 So someone like you, you've got a million things going on, it's just literally impossible to post to multiple places.
00:34:43.000 You don't have time.
00:34:44.000 Is there an app that allows you to post something to a shitload of places?
00:34:49.000 Well, we have auto-importing from YouTube and Twitter.
00:34:52.000 So you can pull your stuff in and just post on your own.
00:34:54.000 So we can maybe get that set up.
00:34:56.000 YouTube as well.
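A minimal sketch of the kind of auto-importing described here, polling a YouTube channel's public RSS feed with the feedparser library. The channel ID and the posting stub are placeholders, not Minds' real importer.

```python
import feedparser  # pip install feedparser

# YouTube exposes a public RSS feed per channel; the ID below is a placeholder.
CHANNEL_ID = "UCxxxxxxxxxxxxxxxxxxxxxx"
FEED_URL = f"https://www.youtube.com/feeds/videos.xml?channel_id={CHANNEL_ID}"

def cross_post(title: str, link: str) -> None:
    # Stand-in for a call to another network's posting API.
    print(f"would cross-post: {title} -> {link}")

for entry in feedparser.parse(FEED_URL).entries[:5]:
    cross_post(entry.title, entry.link)
```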
00:34:56.000 So you have a video aspect in Minds.
00:34:58.000 Yeah, we do.
00:34:58.000 We support video.
00:34:59.000 We support all multimedia, blogs, Messenger.
00:35:02.000 Do you host that video?
00:35:03.000 Yeah.
00:35:04.000 So what are the implications?
00:35:05.000 Like, what if someone posts something illegal?
00:35:07.000 Yeah, it gets taken down.
00:35:08.000 Okay.
00:35:09.000 Yeah, so we definitely are, like, U.S. law, First Amendment based.
00:35:12.000 Right, right.
00:35:13.000 So just quickly to show you.
00:35:15.000 So you got phones.
00:35:16.000 So these are, this is called a Librem.
00:35:18.000 This is made in the U.S. So this is, like, trying to get rid of conflict minerals.
00:35:23.000 Really?
00:35:23.000 It's very heavy.
00:35:24.000 It's a tank.
00:35:26.000 Does it suck?
00:35:27.000 It kind of sucks.
00:35:28.000 No, no.
00:35:30.000 They're a great team, honestly.
00:35:31.000 It's not an easy project.
00:35:33.000 It's amazing for how hard of a project it is.
00:35:37.000 It does not suck.
00:35:37.000 It's a legitimate effort.
00:35:40.000 What are all these things on the back?
00:35:42.000 These, like, switches?
00:35:44.000 Yeah, don't switch those, because I don't know what they do yet.
00:35:47.000 I just got it, like, two days ago.
00:35:50.000 Does it charge?
00:35:52.000 Is it charged up?
00:35:52.000 Yeah, I'll have to charge it later.
00:35:54.000 Okay, that's the Librem 5?
00:35:55.000 Yeah, yeah.
00:35:56.000 Security and privacy focused phone.
00:35:58.000 You got to go to the USA one.
00:36:00.000 Because I think the important...
00:36:01.000 Made in the USA does matter because you talk a lot about the conflict mineral situation with phones.
00:36:06.000 And I've seen you bring up other phones.
00:36:08.000 There was like the Fair phone.
00:36:09.000 There was like some other attempts at it.
00:36:11.000 What happened, Jimmy?
00:36:14.000 So this one...
00:36:16.000 What is this one?
00:36:17.000 That one's called the PinePhone.
00:36:19.000 Yeah, I've heard of this one as well.
00:36:20.000 Yeah, so that's much cheaper.
00:36:22.000 This one's like 2K. Really?
00:36:24.000 Yeah.
00:36:24.000 Why is it so expensive?
00:36:25.000 Because it's a beastly machine.
00:36:28.000 This is a computer.
00:36:30.000 Right.
00:36:30.000 So you can hook this up to a monitor.
00:36:33.000 This is a Linux OS. Yeah, it's got kill switches.
00:36:36.000 That's what those buttons are.
00:36:38.000 Physically disconnected components.
00:36:40.000 And the CIA's like, yeah, yeah, go ahead.
00:36:41.000 That works.
00:36:42.000 Mm-hmm.
00:36:43.000 Use our unique hardware kill switches to physically disconnect Wi-Fi, Bluetooth, cellular signal, microphone, and camera with kill switches.
00:36:52.000 Yeah, because a lot of it has to do with the chain of custody of these products because proprietary surveillance chips will get added to the phone in its life cycle throughout the factories globally.
00:37:04.000 So they're saying, look, we need to make sure to be able to commit to our customers that there's no sketchy chips on this thing that's feeding data to some...
00:37:12.000 Someplace we don't know about.
00:37:14.000 Right.
00:37:14.000 And that was the big thing with Huawei, right?
00:37:17.000 One of the things about banning Huawei in the United States was they had proven that some of their routers were allowing third parties to access the information as it passed between the two parties.
00:37:30.000 So a third party could come in, scoop up all the intellectual property and just use it.
00:37:37.000 And you know, sometimes some of these companies, they work both sides.
00:37:41.000 So the ones that create the device to prevent something is the same company that creates the device to take something.
00:37:48.000 Like in the Washington, D.C. area, for example, a few years back, D.C. was being sued for the cameras, the red light cameras.
00:37:58.000 You run the red light, you get a ticket.
00:38:00.000 Well, Lockheed Martin had created those cameras, and they were shortening the length of the yellow light So you got a bigger chance of running the red light, all right?
00:38:11.000 So for every ticket that was written, Lockheed Martin was getting a dollar, and the rest of it would go to the D.C. Police Department, right?
00:38:17.000 So dirty.
00:38:18.000 But Hewlett-Packard, you know, the same ones who make the radar gun that you get caught on are the same ones who make the radar detector that we use.
00:38:28.000 So they get money from both ends.
00:38:30.000 Well, it's good business.
00:38:32.000 This one, is this as good?
00:38:34.000 The Pine Phone?
00:38:35.000 Honestly, it's all very early.
00:38:37.000 Yeah, sorry that it's not charged.
00:38:38.000 Have you used any of these?
00:38:39.000 I just...
00:38:42.000 I just picked up these from the- Because if you just use an iPhone, we got a fucking problem.
00:38:46.000 No, I do.
00:38:47.000 I do.
00:38:47.000 I'm using- I got this because the cool thing about the Librem is that you can plug this into a monitor with a keyboard, and this is a computer.
00:38:56.000 Do you have a USB-C charger?
00:38:58.000 You could maybe- I have one.
00:39:00.000 You do?
00:39:00.000 What charger?
00:39:01.000 Yeah.
00:39:01.000 Yeah, if Jamie wants to plug that in, he can plug this in.
00:39:03.000 I'd like to see what that thing's all about.
00:39:05.000 It's on like this?
00:39:06.000 Yes, perfect.
00:39:07.000 The other thing is, you know Adam Curry, right?
00:39:12.000 Yeah, I've seen him.
00:39:13.000 Adam Curry, who is the original podfather.
00:39:17.000 He is literally the man who created the original podcast.
00:39:21.000 Under your leg, you're wrapped up, though.
00:39:24.000 You're wrapped up.
00:39:25.000 I'm wrapped up.
00:39:27.000 See it under the table?
00:39:31.000 Yes, it's actually connected to the table.
00:39:35.000 Adam has this No Agenda podcast, and they have a No Agenda phone, and it's essentially a de-Googled Android phone that removes all of the tracking stuff, all the stuff where, you know, but you can't use navigation on it.
00:39:48.000 There's a lot of shit you can't use on it.
00:39:50.000 Is it based on Graphene?
00:39:51.000 I do not know.
00:39:52.000 I think it is.
00:39:53.000 I'm not sure.
00:39:54.000 Jamie, check out a...
00:39:54.000 Go to No Agenda phone.
00:39:55.000 Okay.
00:39:56.000 Sorry.
00:39:57.000 This thing, does this have a navigation?
00:40:01.000 This is not really usable.
00:40:02.000 It's not really replaceable for your standard.
00:40:06.000 This is not Android-based.
00:40:08.000 And neither is PinePhone, actually.
00:40:11.000 This is a no-agenda phone.
00:40:12.000 Small batch, artisan, secure, private.
00:40:14.000 Is it open source?
00:40:16.000 Go to the footer.
00:40:19.000 Go all the way down to the footer.
00:40:21.000 So, typically, scroll up.
00:40:22.000 It looks like it's not...
00:40:23.000 Are they publishing their code?
00:40:25.000 Keep going up.
00:40:27.000 It is Graphene.
00:40:28.000 It's Graphene.
00:40:29.000 That's good.
00:40:29.000 Good.
00:40:30.000 Alright, great.
00:40:31.000 So, this OS is essentially raw AOSP, the Android Open Source Project, with some custom bits.
00:40:39.000 If you choose a phone that is supported directly by Lineage and not some random developer on XDA, it is as secure as the G variant.
00:40:47.000 If you choose to build...
00:40:48.000 I don't know what they're saying here.
00:40:49.000 Do you know what they're saying here?
00:40:52.000 I see.
00:40:53.000 So Graphene OS is the most secure option endorsed by Edward Snowden, entirely funded by donations and a guy.
00:41:01.000 What does that mean?
00:41:02.000 What does that mean?
00:41:04.000 How fucking random is that?
00:41:06.000 A guy.
00:41:08.000 The OS is updated and patched more often than G does with every conceivable method of hardening possible.
00:41:14.000 The only downside is casual adopters.
00:41:16.000 There's a relatively limited compatibility layer for apps and access services similar to those provided by G. G must be Google.
00:41:24.000 So would it be less effective if it was entirely funded by donations and a gal?
00:41:30.000 Yeah, or a they.
00:41:34.000 Edward Snowden speaks on it at the bottom.
00:41:36.000 It says that software is equally important.
00:41:39.000 The iOS and Android operating systems that run on nearly every smartphone conceal uncountable numbers of programming flaws, known as security vulnerabilities.
00:41:50.000 That means common apps like iMessage or web browsers become dangerous.
00:41:54.000 You can be hacked.
00:41:55.000 So he uses the Graphene network as his base operating system.
00:42:00.000 Yeah, I think that's a legit project, for sure.
00:42:02.000 Graphene seems like a stripped-down version of Android, which Google created Android.
00:42:08.000 But there's a lot of stuff that you can't use, right?
00:42:10.000 Like navigation.
00:42:12.000 That's a big one for me.
00:42:13.000 Yeah, you can...
00:42:14.000 No, I think you can probably get, like, OpenStreetMap and have some very simple navigation.
00:42:20.000 You're going to be lost as fuck.
00:42:20.000 You got a friend in the country, you're not going to find them.
00:42:23.000 Yeah, I mean, look, it comes down to, like, are people willing...
00:42:27.000 What sacrifices are people willing to make?
00:42:30.000 And can we have a reasonable conversation with these companies to find a middle ground?
00:42:34.000 Well, the key would be, then, to have...
00:42:38.000 That graphene phone, right?
00:42:40.000 With kill switches.
00:42:41.000 So you use the navigation when you need it, then kill it.
00:42:44.000 Yeah, I found out that, like, Find My iPhone...
00:42:46.000 Doesn't that work even if your phone is off?
00:42:50.000 Yeah.
00:42:51.000 How the fuck is that?
00:42:52.000 Androids don't do that.
00:42:53.000 No?
00:42:54.000 I don't think so.
00:42:55.000 You think Android's better?
00:42:56.000 Well, I'll tell you a quick experience, right?
00:42:59.000 Okay.
00:43:00.000 I was flying with...
00:43:01.000 You won't name the airlines.
00:43:03.000 And I got off the plane...
00:43:06.000 I went to turn on my phone.
00:43:08.000 It wasn't there.
00:43:09.000 So I tried to get back on the plane because I figured it fell down the seat.
00:43:12.000 Oh, you can't get back on the plane, sir.
00:43:14.000 What seat were you in?
00:43:14.000 We'll go look for your phone.
00:43:16.000 Gave them the seat number.
00:43:17.000 They come back five minutes later.
00:43:18.000 We check between the seats, under the seat, in the pocket.
00:43:21.000 No phone.
00:43:22.000 You know, call customer service or email customer service and give them a description.
00:43:26.000 They'll look for it for 30 days.
00:43:28.000 So I go through all that.
00:43:29.000 And every couple of days, they would contact me.
00:43:33.000 Now, I called Verizon.
00:43:34.000 And, you know, has my phone, you know, can you track my phone?
00:43:38.000 Well, we can't track your phone if it's turned off.
00:43:41.000 So they told me I have an Android.
00:43:43.000 Okay, so I kept calling to see if anybody used it, right?
00:43:46.000 And I kept calling my phone and stuff.
00:43:48.000 Nothing.
00:43:49.000 So every couple days, you know, they would let me know, we're looking for your phone.
00:43:51.000 We haven't found anything yet.
00:43:53.000 And then in 30 days, they will stop the search.
00:43:56.000 But if anybody turns it in, whatever, they'll let me know.
00:43:59.000 So, on the 40th day—well, they gave me an email on the 30th day saying, you know, we're sorry we have not located your lost item.
00:44:07.000 However, if anybody turns in, then we'll let you know.
00:44:09.000 On the 40th day, I got an email from them saying, your lost item has been found.
00:44:15.000 It's in the lost—now, I— I left it in the seat in D.C. when I was flying somewhere.
00:44:23.000 And they found the phone, and it's in the lost and found at the Houston airport.
00:44:30.000 Now, that plane flies around the country 10 times a day.
00:44:34.000 It gets cleaned 10 times a day every time people deboard, right?
00:44:37.000 And so people were cleaning that plane for 40 days, and nobody found that phone.
00:44:43.000 It was found deep in between the seats.
00:44:46.000 It'd fall off my hip, right?
00:44:47.000 So the moral of the story is, you know, the best clean planes leave out of Houston.
00:44:53.000 [Laughter] So you couldn't find it, like, the way you find an iPhone.
00:44:59.000 So iPhones, apparently there's like some signal that's being sent.
00:45:03.000 Yeah.
00:45:05.000 So that alone is a little bit of a red flag, right?
00:45:09.000 Yeah, I mean, Apple is, they try to have this privacy argument, and it's so shocking to me that they try to push that, like, oh, you know, we're not going to let the FBI in, like, trust us.
00:45:19.000 And look, Apple makes beautiful products.
00:45:21.000 Everybody knows that they're the best designers in the world.
00:45:24.000 What do you use?
00:45:24.000 I use stripped Android, and I'm going to start using this because this can be my computer as well.
00:45:31.000 But I'll use both.
00:45:34.000 I'm playing with all the options right now, but I like to use Linux as my desktop computer.
00:45:40.000 Can I see what you use?
00:45:42.000 That's your phone?
00:45:43.000 This is my phone.
00:45:44.000 This is a, well, this is just Android.
00:45:48.000 That's just regular Android?
00:45:49.000 Yeah, this is just regular Android.
00:45:50.000 It's not even stripped?
00:45:50.000 This is a non-stripped version.
00:45:52.000 So a privacy guy is having a phone that's tracking him everywhere he goes.
00:45:56.000 I have, dude, we're all...
00:45:59.000 In the midst of this world.
00:46:01.000 Look at me.
00:46:02.000 I mean, come on.
00:46:03.000 You've got to give me credit for having five different phones.
00:46:05.000 Do you have different phones because you have to check how Minds is on different operating systems?
00:46:10.000 I'm not doing all of our QA, but I have Linux devices, Windows devices, Apple devices.
00:46:18.000 I use it all.
00:46:20.000 You use Android as your main phone.
00:46:23.000 Yes.
00:46:23.000 Why do you do that?
00:46:24.000 I do that because Android is at least open source in its base function.
00:46:29.000 So over Apple, I will choose Android because like we see with Graphene, you can fork Android and create a stripped-down version.
00:46:37.000 Now, it is...
00:46:40.000 Absolutely imperative.
00:46:41.000 I need to get a Graphene, pure Graphene version.
00:46:44.000 That's on my list of things to do.
00:46:46.000 I've got, you know, there's a ClearOS here, which is...
00:46:50.000 What's that?
00:46:51.000 That's...
00:46:52.000 What is ClearOS?
00:46:53.000 That's another open source Android.
00:46:57.000 Does this...
00:46:58.000 Ooh, that's pretty.
00:46:59.000 Yeah.
00:46:59.000 Does this have some sort of GPS system?
00:47:04.000 I think it does, yeah.
00:47:07.000 You want to open that so I can check it out?
00:47:09.000 I mean...
00:47:09.000 So, what people are concerned with is obviously someone being able to access their information, someone tracking them.
00:47:21.000 I think people are concerned and like, alright, so look at me for an example.
00:47:25.000 Like, I'm sort of in this privacy world, but I'm also not like a privacy...
00:47:30.000 I'm a privacy maximalist for what I want to be private.
00:47:34.000 I'm not saying, like, I'm never going to use any big tech app, ever.
00:47:40.000 It's just an irrational, impossible mission.
00:47:43.000 I've driven myself crazy, like, thinking that I should do that.
00:47:46.000 You have to go off the grid there.
00:47:48.000 Like, I'm going to be a human, okay?
00:47:51.000 And I'm going to explore all the different options and hopefully transition.
00:47:56.000 So, like, I have gotten rid of most...
00:47:59.000 I don't use big tech...
00:48:02.000 Nearly as much as I used to, probably like 10% of what I used to. I deleted most of my accounts. I'll check in sometimes, because I like to see what's going on and to understand the market. But, you know, I still need to get around to it with maps, and as soon as we have an alternative, I will do it. I'll be the first one in line. I'm trying to get all these options in front of me,
00:48:24.000 but it seems like operating systems and applications, the trend is for them to get more intrusive, right?
00:48:33.000 Like TikTok is supposedly, they back-engineered it and said it's the absolute worst software that they've ever examined in terms of, like, violating your privacy.
00:48:43.000 Yeah, let me just go through.
00:48:46.000 On my actual phone, I'll just name a few apps which I think are a huge part of the privacy future.
00:48:53.000 Minds is a part of a bigger network.
00:48:55.000 The ultimate place where things are going, there's not going to be some new replacement for Google that's centralized.
00:49:02.000 It's going to be protocols that apps are all interoperating on.
00:49:06.000 So like, Briar is this amazing app that's currently going viral in Ukraine.
00:49:10.000 And Julian Assange actually posted about this app from prison.
00:49:14.000 He was able to communicate to his people.
00:49:17.000 So Briar is fully decentralized.
00:49:19.000 It runs over Tor.
00:49:20.000 And it can even run offline.
00:49:23.000 So you can, we could chat over Bluetooth.
00:49:25.000 I could be in a burning building and you're across the street in Ukraine.
00:49:28.000 We're getting bombed.
00:49:29.000 Internet is down.
00:49:30.000 And we're chatting.
00:49:32.000 Like, unbelievable mesh networking technology.
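The store-and-forward idea behind mesh messengers like Briar can be sketched in a few lines. What follows is a minimal simulation of the general concept, not Briar's actual Bramble protocol; the peer names are hypothetical, and the in-memory links stand in for Bluetooth or Wi-Fi Direct radios.

```python
# Toy store-and-forward mesh: each peer remembers which message IDs it
# has seen and re-forwards anything new to every peer in range.
import uuid

class Peer:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # peers currently in radio range
        self.seen = set()     # message IDs already relayed (flooding guard)
        self.inbox = []

    def link(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)

    def receive(self, msg_id, sender, text):
        if msg_id in self.seen:
            return            # duplicate; stop it bouncing forever
        self.seen.add(msg_id)
        self.inbox.append((sender, text))
        for peer in self.neighbors:
            peer.receive(msg_id, sender, text)

    def send(self, text):
        self.receive(uuid.uuid4().hex, self.name, text)

# A chain of peers: the message hops across even though the endpoints
# are never directly connected, and no internet is involved.
alice, relay, bob = Peer("alice"), Peer("relay"), Peer("bob")
alice.link(relay)
relay.link(bob)
alice.send("internet is down, meet at the shelter")
print(bob.inbox)  # [('alice', 'internet is down, meet at the shelter')]
```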
00:49:35.000 Is Briar like B-R-I-A-R? Yeah, so it's been a long time coming for them.
00:49:41.000 We're looking at integrating with the Bramble protocol, which is kind of the base protocol of Briar.
00:49:47.000 But, you know, there are a handful of fully decentralized options, also Secure Scuttlebutt and some others.
00:49:58.000 It's really cool.
00:49:59.000 I recommend checking it out.
00:50:01.000 And I think that off-grid technology that's not reliant on internet service providers is just, I mean, that's crazy.
00:50:11.000 The fact that that's even possible to chat with no internet?
00:50:14.000 That is crazy.
00:50:15.000 And sometimes when you go to a different country with your phone, you have to be compliant with that country's internet laws.
00:50:24.000 I mean, they can get into your phone where maybe the U.S. can't.
00:50:27.000 Right.
00:50:28.000 Do you think, like, in terms of privacy, would you recommend Google or Apple?
00:50:33.000 That question is not a question.
00:50:36.000 Really?
00:50:37.000 No.
00:50:37.000 But doesn't Apple at least give you the option to block advertisers from accessing your information, to block cross-platform or cross-application sharing of data?
00:50:53.000 Yes.
00:50:53.000 They've been locking down their app store, which has taken billions of dollars away from Google and Facebook advertising because they don't allow apps to do what they used to be able to do.
00:51:01.000 Right.
00:51:01.000 Isn't that good?
00:51:02.000 That is good.
00:51:03.000 That is good.
00:51:04.000 So Apple would be a better choice?
00:51:05.000 I don't know.
00:51:07.000 I think that, yes, that is a good thing in sort of cost-benefit.
00:51:11.000 But Apple is the mother of closed systems.
00:51:15.000 I mean, Steve Jobs literally said...
00:51:18.000 Proprietary.
00:51:18.000 It's like, we have a closed, walled garden.
00:51:22.000 Yes.
00:51:22.000 And that was his whole thing.
00:51:24.000 Like, we do not want anyone seeing what we're doing.
00:51:26.000 Hyper-competitive.
00:51:28.000 Apple does very, you know, relatively little open source compared to a lot of other companies.
00:51:33.000 Well, I remember the days of clones, where you could buy a fake Apple machine that runs Mac OS, and they shut them all down.
00:51:41.000 Because you could buy, like, a...
00:51:43.000 Bomber machine that has crazy power and gigantic hard drives and multiple hard drives, way more potent than anything that Apple was selling in the 90s.
00:51:54.000 And they banned all that stuff.
00:51:55.000 Is that for gaming and stuff?
00:51:57.000 Yeah, for gaming and just for people who do video editing and just people that wanted some crazy, ultra-hyped-up machine, and it would still run the Mac OS.
00:52:07.000 And this was the early Mac OS.
00:52:10.000 This was before OS X, which was the Unix-based operating system.
00:52:15.000 That was back when Apple's operating system was a little janky.
00:52:19.000 It was kind of sketchy.
00:52:21.000 Crash a lot.
00:52:22.000 No preemptive multitasking.
00:52:25.000 It had, like, no memory protection.
00:52:27.000 It would crash.
00:52:28.000 People were really devoted to it, but that shit would crash a lot until OS X came along.
00:52:34.000 Do you feel like you're at all willing to sacrifice any convenience in your technology?
00:52:40.000 Yes!
00:52:40.000 Yeah, I'm willing to sacrifice some.
00:52:43.000 What would be something that you would be willing to...
00:52:46.000 That's a good question.
00:52:46.000 I ask myself that all the time.
00:52:48.000 Because I'm fucking sure the government's paying attention to my phone.
00:52:52.000 So it's like, what am I willing to sacrifice?
00:52:56.000 Right now, I treat all interactions as if the government's watching.
00:53:00.000 That's what I do.
00:53:03.000 Yeah, there's an Android app store called F-Droid, which is like a non-Google Play app store.
00:53:10.000 So you can actually get apps outside of Google Play.
00:53:13.000 On iOS, you can only get apps on the app store.
00:53:16.000 But how do you know if you go to this F-Store?
00:53:19.000 Is that what it's called?
00:53:20.000 F-Droid.
00:53:21.000 F-Droid.
00:53:21.000 How do you know if you go to this F-Droid whether or not this is spyware?
00:53:26.000 Is it vetted?
00:53:27.000 Well, there's spyware, 90% of the apps on Google Play are spyware.
00:53:32.000 Really?
00:53:33.000 Yes!
00:53:34.000 I mean, every app you install is infecting your system, most of them, because most apps are proprietary.
00:53:40.000 But Google is worse for that.
00:53:42.000 I mean, every app is different.
00:53:44.000 Every app has different permissions that they're giving and, you know, different security implications.
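As a rough illustration of "different permissions, different security implications," the short audit below flags Android permissions that are commonly abused for tracking. The permission strings are real Android identifiers; the flashlight app and its requested list are invented for the example.

```python
# Flag permissions that commonly enable tracking or data harvesting.
RISKY = {
    "android.permission.ACCESS_FINE_LOCATION": "precise GPS location",
    "android.permission.READ_CONTACTS": "your contact list",
    "android.permission.READ_PHONE_STATE": "device identifiers",
    "android.permission.RECORD_AUDIO": "microphone access",
}

def audit(app_name, requested):
    flags = [(p, RISKY[p]) for p in requested if p in RISKY]
    print(f"{app_name}: {len(flags)} risky permission(s)")
    for perm, why in flags:
        print(f"  {perm} -> {why}")

# Why does a flashlight need your location and contacts?
audit("hypothetical_flashlight_app", [
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
])
```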
00:53:50.000 So I don't think that there's...
00:53:52.000 I think that the people at F-Droid and in the open source community as a general trend just care about these things more.
00:53:58.000 So they're not gonna, you know...
00:54:00.000 But there still can be malicious stuff in the open source realm.
00:54:03.000 But, you know, you gotta kind of understand...
00:54:09.000 That's a lot of work.
00:54:10.000 It is a lot of work.
00:54:11.000 But do you research the food that you eat?
00:54:14.000 Well, it's pretty simple.
00:54:15.000 Is it?
00:54:16.000 Yeah.
00:54:17.000 It's a pretty big education.
00:54:18.000 I think there's a lot of people who don't know.
00:54:21.000 Yes, but I've already had that education.
00:54:23.000 Right.
00:54:24.000 It's the same way with tech.
00:54:25.000 Yeah, I'm sure.
00:54:26.000 There's a learning curve.
00:54:28.000 But it seems like people are making new things.
00:54:30.000 Like, they're not really making new food.
00:54:33.000 Real food.
00:54:34.000 Like, real food has kind of been established.
00:54:36.000 Impossible burgers!
00:54:36.000 Yeah, that's not food.
00:54:38.000 That shit's terrible for you.
00:54:39.000 Have you had one?
00:54:39.000 Daryl, have you had one?
00:54:40.000 I have not.
00:54:41.000 I've seen them, but I've not had one.
00:54:43.000 Have you?
00:54:43.000 No.
00:54:44.000 Really?
00:54:44.000 Yeah, no... actually, that would be a lie.
00:54:46.000 I did.
00:54:47.000 It's a lie, rather.
00:54:48.000 I did.
00:54:49.000 We did a show once at Stubbs, and my friend C.K. brought a bunch of burgers from a bunch of different places, and some of them were plant-based, so I took a bite.
00:54:57.000 It's just like a bland burger.
00:55:00.000 I actually switched from vegan.
00:55:02.000 To what?
00:55:03.000 I was vegan for like four years.
00:55:04.000 Switched back.
00:55:06.000 Switched back.
00:55:08.000 Yeah, switched back.
00:55:09.000 I was a meat eater, yeah.
00:55:10.000 Well, my wife actually has an autoimmune issue.
00:55:17.000 Not to overshare.
00:55:19.000 So she and we were vegan together.
00:55:21.000 Have you heard of Weston Price?
00:55:23.000 No.
00:55:23.000 He's a really famous nutritionist and...
00:55:27.000 Has this diet, like, heavy into organ meats and...
00:55:31.000 Nose to tail, that kind of thing.
00:55:33.000 Yeah, fermented foods and probiotics and stuff.
00:55:36.000 And so she was being told by her doctor that you have to go on this drug called Remicade every six weeks IV for the rest of your life.
00:55:46.000 What is it for?
00:55:47.000 It's for Crohn's.
00:55:48.000 Okay.
00:55:49.000 And she was like, what?
00:55:54.000 Life?
00:55:55.000 Like, every six weeks?
00:55:56.000 You're kidding me.
00:55:57.000 And so she was just like, no.
00:56:00.000 Okay, so she switched her diet.
00:56:01.000 So she switched her diet and is in remission.
00:56:04.000 Really?
00:56:05.000 And her diet consists of what now?
00:56:08.000 It's pretty much, I mean, if you look up Weston Price, but, you know, meats, fermented foods.
00:56:15.000 Just avoiding bread, sugar.
00:56:17.000 Salmon.
00:56:18.000 Yeah, yeah.
00:56:19.000 And she, so there's been studies done on cabbage juice.
00:56:23.000 This is a big thing for people with ulcers.
00:56:25.000 There have actually been studies showing that hardcore cabbage juice for like six weeks
00:56:29.000 Can reverse ulcers.
00:56:31.000 And there have been studies on this.
00:56:32.000 And her regular gastro doctor didn't even know about that.
00:56:36.000 In fact, she credits a huge transformation to the cabbage juice regimen.
00:56:41.000 Anyway, not to go off on a tangent.
00:56:43.000 We are on a tangent.
00:56:44.000 But your own personal experience, what was the difference between going from vegan to eating meat again?
00:56:48.000 I mean, I respect vegans.
00:56:51.000 I really do.
00:56:51.000 Especially the ones who aren't annoying.
00:56:55.000 There's like five of them.
00:56:57.000 There are five of them.
00:56:59.000 There's this guy, Ed, Earthling Ed, who's very honest and not preachy and has good information.
00:57:07.000 But anyway, I feel better.
00:57:10.000 I love eggs and meat and all that stuff.
00:57:13.000 And I do feel like...
00:57:27.000 Because I wanted to, you know, I didn't need that much of a reason, you know, doing it with my family.
00:57:34.000 And also...
00:57:36.000 It's just, you know, I'm not ideological about stuff.
00:57:41.000 I don't want to get stuck in ideology about food or whatever.
00:57:45.000 This is why I wanted you to talk about this, because this is exactly the kind of conversation that some people would like to suppress.
00:57:51.000 Exactly.
00:57:51.000 Because there are people that say that eating meat is bad for the environment, and I've had a bunch of people on to try to discuss that.
00:57:58.000 Pro and con. The latest is... what is it? What's Diana's last name? Rob Wolf and...
00:58:05.000 Diana Rogers.
00:58:06.000 They wrote a book called Sacred Cow.
00:58:08.000 We were talking about regenerative farming with them.
00:58:10.000 But there are people that think that those conversations should be suppressed.
00:58:15.000 And that when you have these kind of conversations, they should be flagged, you should be shadow banned.
00:58:20.000 There's a lot of people that promote the carnivore diet on Instagram that find themselves shadow banned.
00:58:25.000 And they have real issues.
00:58:26.000 Paul Saladino, carnivore MD. I think they took his account down.
00:58:31.000 As misinfo?
00:58:32.000 I don't know what the fucking excuse was.
00:58:33.000 I think some wacky vegan activist who works for the company can just decide that they're gonna take your account down.
00:58:40.000 I think there's a certain amount of control that the people that work there have, where it's very subjective.
00:58:46.000 So imagine, so Facebook spends tens of billions on moderation.
00:58:53.000 Or they have.
00:58:54.000 And so our vision, imagine if rather than tens of thousands of censors who are just going like, down, down, down, take it down, hate speech, misinformation, conspiracy theory.
00:59:07.000 What if you had tens of thousands of mental health professionals and positive intervention people and just like people engaging in dialogue who can...
00:59:17.000 Provide mental health resources to users who need it to share information.
00:59:21.000 I'm not saying you need no moderation.
00:59:23.000 You definitely do need a certain level.
00:59:25.000 But that's so much money and human energy.
00:59:29.000 I mean, you've seen the PTSD studies of these content moderators at Facebook.
00:59:32.000 These people get depressed.
00:59:34.000 They're suicidal.
00:59:37.000 They're just seeing Al-Qaeda videos all day.
00:59:39.000 Yeah, they're just watching crazy stuff, and that's a real thing that is unavoidable to a certain degree.
00:59:45.000 But, I mean, to bring in experts in dialogue to engage...
00:59:51.000 Imagine if Facebook spent billions of dollars on that.
00:59:55.000 Mental health resources for the community.
00:59:57.000 Would that be effective?
00:59:59.000 Well, yeah, because, I mean, look at it this way.
01:00:01.000 Say, 25, 30 years ago, insurance companies were not paying for acupuncture.
01:00:07.000 Oh, that's nonsense.
01:00:10.000 It's, you know, what do you call it?
01:00:11.000 A placebo or something.
01:00:12.000 Right.
01:00:13.000 Now they do.
01:00:14.000 Now they see value in it.
01:00:16.000 Well, Chinese people have been using that for 2,000 years.
01:00:18.000 Would they still be using it 2,000 years later if it wasn't working?
01:00:21.000 So now we're accepting, you know, some Eastern culture.
01:00:25.000 Now we're, you know, when our doctor does not give us what we...
01:00:28.000 Hope will cure us for our cancer, our diabetes, or whatever.
01:00:32.000 We go the holistic route, and we've found some pretty amazing results.
01:00:37.000 And that's what, you know, Minds is doing, the holistic approach, by giving everybody a platform to share their information, like you just shared about the cabbage juice.
01:00:49.000 You know, somebody hears this podcast and goes out and tries cabbage juice, and it clears up their wife's ailment or something like that.
01:00:56.000 And this is a good subject to talk about now, because we just got through the pandemic, and one of the things that was suppressed was information about methods of treating COVID. I mean, it was a giant issue where if you talked about hydroxychloroquine or ivermectin or whatever you would talk about,
01:01:15.000 even vitamins, we're talking about like the difference between the COVID results of people that were vitamin D insufficient versus people that had sufficient levels.
01:01:23.000 It's a giant difference.
01:01:24.000 But if you talked about that, you would get in trouble for disinformation or misinformation, and you would get either shadow banned or outright banned.
01:01:32.000 I mean, there were people that were banned from social networks for suggesting that people who are vaccinated can still spread COVID. That turns out to be an absolute fact now.
01:01:42.000 But if you said that eight months ago, nine months ago... instead of having this conversation and having medical experts debate it, people that understand it and don't understand it asking questions, people who are following the standard narrative,
01:01:58.000 they express themselves, and then people that have alternative ideas express themselves, and we find out what's right and what's wrong.
01:02:04.000 Somebody expressed that it could be treated with bleach, right?
01:02:08.000 Wasn't that Trump?
01:02:09.000 Yeah, like an infusion of bleach.
01:02:13.000 Yeah, I mean, there should be warnings.
01:02:15.000 But imagine if rather than a fact check warning like, you know, these three think tanks said that this is false.
01:02:22.000 What if you could actually see a visualization of the debate that showed both sides and gave you like a probability score?
01:02:29.000 Or something on the piece of content, as opposed to saying black or white.
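One way such a probability score could work, sketched here rather than taken from any platform: aggregate crowd votes on a claim with a confidence-adjusted formula like the Wilson lower bound, so a handful of votes cannot produce a confident-looking extreme. The vote counts below are invented.

```python
import math

def wilson_lower_bound(up, down, z=1.96):
    """Lower bound of the 95% Wilson interval for the fraction of
    raters who mark a claim as supported. Small samples get pulled
    toward the middle instead of showing a confident extreme."""
    n = up + down
    if n == 0:
        return 0.0
    p = up / n
    denom = 1 + z * z / n
    center = p + z * z / (2 * n)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)
    return (center - margin) / denom

# 80 "supported" votes vs. 20 "disputed": a score, not a verdict.
print(round(wilson_lower_bound(80, 20), 3))  # ~0.711
```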
01:02:33.000 Well, who checks the fact checkers?
01:02:34.000 That's the problem.
01:02:36.000 There's a lot of fact checkers that are just full of shit.
01:02:39.000 There's a lot of things that are mostly true or mostly false.
01:02:42.000 And you look into it and you're like, fuck you.
01:02:44.000 This is not mostly false.
01:02:46.000 It's either false or true.
01:02:48.000 It can't be mostly.
01:02:49.000 It's like, you know, somebody's being sort of pregnant.
01:02:53.000 I mean, either you're pregnant or you're not, you know?
01:02:54.000 But if there's multiple statements about an issue, and some of them are correct and some of them are not, then it would be, like, mostly true.
01:03:00.000 Well, take a piece of COVID, you know, content like you were talking about, and, you know, there's going to be studies on one side and another.
01:03:06.000 What do you do on Minds for that stuff?
01:03:08.000 Well, we're building out a sort of citation tool to kind of show the citations on both sides of various arguments and, you know, have it more crowdsourced. This really gets into the realm of decentralized identity and where we're moving in terms of reputation and credibility on the internet.
01:03:27.000 And right now, you've got all these different logins, what we were talking about, where things are going with crypto and with the web standards.
01:03:36.000 Really, we're moving towards a place where you have these credentials associated with your core identity, which can be generated from like a crypto wallet or something like that.
01:03:46.000 And you'll have all these badges that you're earning everywhere you go.
01:03:49.000 And you can decide to disclose those or not disclose those, like NFTs.
01:03:54.000 I mean, right now...
01:03:55.000 I'm confused.
01:03:56.000 What are you earning badges for?
01:03:58.000 Can we see the interface?
01:03:59.000 Would you pull up Minds so we can see the interface?
01:04:02.000 Yeah, so, ultimately, credibility on the internet.
01:04:07.000 It's like, how do you measure that?
01:04:09.000 How do you trust users?
01:04:10.000 So it's like, if I say, oh, Bill is a very good guy, he says a lot of true things, he's very reasonable, so you get a badge for that?
01:04:19.000 There could be any infinite number of, you know, badges that you could potentially earn, but like you could be trusted by, say someone in martial arts trusts you and they give you a signal of trust,
01:04:34.000 then that would add to your credibility in martial arts in your decentralized identity on the internet, which would be interoperable between social networks.
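A toy version of such a portable badge, assuming Ed25519 signatures from the third-party Python cryptography package: an endorser signs a small claim about a subject, and any network holding the endorser's public key can verify it without asking a central server. The DIDs and claim format are made up for illustration; this is not Minds' implementation or the W3C Verifiable Credentials spec.

```python
# pip install cryptography
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The endorser (say, a martial arts coach) holds a keypair.
issuer_key = Ed25519PrivateKey.generate()

# A minimal badge: who it is about, what it claims, who issued it.
badge = json.dumps({
    "subject": "did:example:joe",     # hypothetical decentralized IDs
    "issuer": "did:example:coach",
    "claim": "trusted-in:martial-arts",
}, sort_keys=True).encode()

signature = issuer_key.sign(badge)

# Any app that knows the issuer's public key can check the badge;
# verify() raises InvalidSignature if the badge was forged or altered.
issuer_key.public_key().verify(signature, badge)
print("badge verified")
```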
01:04:44.000 So that there's sort of this web of... Oh, look, I got a page.
01:04:48.000 You got a page.
01:04:49.000 Who put my picture up there?
01:04:50.000 You took over the account.
01:04:52.000 You asked me for the creds.
01:04:54.000 I know.
01:04:54.000 I mean, I'm just saying, who took my picture up there?
01:04:56.000 Some fan or something.
01:04:57.000 I don't know who put my picture up there.
01:04:58.000 Someone fan can just put my picture up there?
01:05:00.000 I don't know.
01:05:01.000 People create fan pages.
01:05:02.000 Okay, so, like, in 2020, it says I posted something.
01:05:07.000 And it got 31,000 views.
01:05:09.000 Wow, look at that.
01:05:10.000 Huberman, that episode's down.
01:05:12.000 Okay, so what is making these things post?
01:05:16.000 I don't know.
01:05:18.000 Someone must have just posted them.
01:05:19.000 Well, how can they post it under my name?
01:05:20.000 This might have been just linked from YouTube, because these are just YouTube posts.
01:05:23.000 Right, but how were they linked in my name?
01:05:26.000 I didn't do that.
01:05:29.000 I don't know.
01:05:31.000 Maybe...
01:05:31.000 I have 118,000 subscribers?
01:05:33.000 Dude, you probably have a bunch of tokens too.
01:05:35.000 Oh, I have tokens.
01:05:36.000 So you should...
01:05:37.000 We'll figure it out.
01:05:38.000 We'll figure it out.
01:05:39.000 So this is two years ago.
01:05:41.000 This is August 13th.
01:05:43.000 So this is during the pandemic.
01:05:44.000 I 100% didn't post that.
01:05:47.000 Okay.
01:05:48.000 So someone is posting that in my name.
01:05:51.000 Did my account get hacked?
01:05:53.000 Imagine my account at Minds got hacked and some dude is just posting.
01:05:58.000 I mean, I sent you the password.
01:05:59.000 Did you change it?
01:06:00.000 I don't know.
01:06:01.000 We'll figure it out.
01:06:04.000 Okay, we'll figure it out.
01:06:05.000 But someone's posting as me and I have 118,000 subscribers.
01:06:09.000 I should probably get on that.
01:06:13.000 You own JoeRogan.com though, right?
01:06:15.000 I think it was connected.
01:06:17.000 It was pulling in your YouTube and something might have gotten...
01:06:21.000 We'll fix it.
01:06:22.000 Fuckery just doesn't seem good, Jamie.
01:06:26.000 There's nothing on it.
01:06:28.000 It's just YouTube.
01:06:28.000 It's just a link to your YouTube link.
01:06:30.000 This was your Twitter picture.
01:06:31.000 But some of them are missing.
01:06:33.000 Some of the YouTube videos are missing.
01:06:33.000 Well, that's because there's stuff missing from YouTube.
01:06:35.000 We don't have everything up on YouTube.
01:06:36.000 Right.
01:06:36.000 Oh, that's right.
01:06:37.000 When we changed over to Spotify, we removed some of the stuff.
01:06:41.000 Spotify only allows us to keep 100 full-length episodes on YouTube at a time.
01:06:47.000 Most of it has to be.
01:06:48.000 We're trying to channel.
01:06:49.000 Here, go to minds.com slash change.
01:06:58.000 So, this is the stuff that you guys are doing.
01:07:03.000 This is Change Minds.
01:07:04.000 Yeah, this is the link to the paper.
01:07:05.000 This is the censorship effect.
01:07:07.000 Yeah.
01:07:08.000 And so, when Vox, who are very strongly left-leaning, when they have a piece that they write saying that there are harmful effects of censorship that actually push people towards more radical ideas.
01:07:24.000 What are they suggesting?
01:07:26.000 Are they suggesting that places like social media, sites like Twitter, back off of censorship and maybe choose an alternative?
01:07:34.000 They're not going that far, but I think it's a step in the right direction.
01:07:37.000 They also talk about the reach.
01:07:40.000 A lot of their question is, what is the reach of content?
01:07:44.000 Alex Jones, for instance.
01:07:46.000 We did an empirical analysis of his reach after he got banned, and it actually went up.
01:07:54.000 Globally.
01:07:54.000 So in terms of all of the views on...
01:07:59.000 Right.
01:07:59.000 Let me stop you right there.
01:08:00.000 Wouldn't it keep going up if he wasn't banned?
01:08:03.000 Because everybody goes up.
01:08:05.000 My shit goes up every month.
01:08:07.000 So when you say his went up, does that mean it went up at a proportionate level, the same as it would if he stayed on Twitter?
01:08:13.000 Or did it go up just based on the baseline of when he got banned?
01:08:17.000 Right.
01:08:18.000 Yeah, I mean, that's sort of an impossible thing to know because you can't really know what the world would be.
01:08:23.000 But you could follow the trend line.
01:08:24.000 Yeah, you could track the trend.
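Following the trend line can be made concrete: fit the pre-ban growth, extrapolate it forward, and compare post-ban actuals against that extrapolation instead of the raw pre-ban baseline. The numbers below are invented for illustration, and statistics.linear_regression requires Python 3.10 or later.

```python
from statistics import linear_regression

# Hypothetical monthly view counts (millions) for months 1-6, pre-ban.
months_pre = [1, 2, 3, 4, 5, 6]
views_pre = [10, 11, 12, 13, 14, 15]

slope, intercept = linear_regression(months_pre, views_pre)

# Compare what the pre-ban trend predicts for months 7-9 against
# invented post-ban actuals; growth above the line suggests the ban
# did not reduce total reach.
views_post = {7: 19, 8: 22, 9: 25}
for month, actual in views_post.items():
    expected = slope * month + intercept
    print(f"month {month}: trend predicts {expected:.1f}M, actual {actual}M")
```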
01:08:25.000 I mean, I think that what Vox is saying is, it depends if, you know, Alex has a platform.
01:08:30.000 So he was going to grow huge kind of in either direction, I would imagine.
01:08:34.000 But small people, when they get banned, you know, that kind of gets buried.
01:08:39.000 You know, no one's complaining when some random person posting a COVID post gets banned from Twitter.
01:08:45.000 They're just lost.
01:08:47.000 There's millions of people who just get lost from that.
01:08:50.000 And so, you know, anyway, in the analysis we saw, the total views of Alex's content went up significantly.
01:08:59.000 But I think that it's, you know, it's called the Streisand effect.
01:09:02.000 But it's also, there's variation on that.
01:09:04.000 And I think it is definitely... censorship also works.
01:09:07.000 Like, in an isolated system.
01:09:10.000 So if you're on Google or Facebook or Twitter, like, yeah, you can silence certain words or topics.
01:09:16.000 But when you're thinking of the internet as a whole, then, you know, the total reach is not necessarily going down.
01:09:25.000 And we need to start thinking about the internet as a whole, not just isolated networks.
01:09:30.000 Like, you can't claim that censorship of COVID misinfo worked when you just banned it from Google and it just went up. Like, what about the global numbers?
01:09:41.000 That's what we need to be looking at.
01:09:43.000 So when you guys got together, how long have you guys been working together?
01:09:48.000 Four years?
01:09:50.000 Yeah.
01:09:51.000 And when I first joined on, it was just approaching two million members, and now it's over five million.
01:10:01.000 So it is growing.
01:10:02.000 Oh, yeah.
01:10:02.000 Minds is growing.
01:10:03.000 It's like, well, hopefully you get a lot more after this one.
01:10:05.000 But it's the difference between that and Facebook.
01:10:09.000 Like, what is Facebook?
01:10:11.000 Oh, God.
01:10:12.000 Billions.
01:10:13.000 And Twitter.
01:10:14.000 Hundreds of millions.
01:10:15.000 If not billions.
01:10:17.000 Yeah, probably close.
01:10:18.000 I don't know.
01:10:19.000 So there's a giant difference in terms of the user experience.
01:10:23.000 Yeah, but here's the crazy thing.
01:10:25.000 You can actually get more reach on minds than Facebook or Twitter if you're a small creator.
01:10:32.000 Because small creators, like getting out of the void on social is so hard.
01:10:36.000 And we have this reward mechanism where you can earn tokens and boost your content.
01:10:41.000 And, you know, we also just wrote out this build your algorithm feature where you can actually opt in to see people who are different from you or similar for you.
01:10:51.000 Or you can opt in to increase the level of tolerance that you have to ideas that you disagree with.
01:10:56.000 How do you adjust that?
01:10:59.000 There's these toggles.
01:11:02.000 So, say if you're a vegan and you're starting to feel a little sick, maybe we should pay attention to some of these carnivore people.
01:11:10.000 You could let a little of that in?
01:11:11.000 Yes.
01:11:12.000 Yes, exactly.
01:11:13.000 Open up your recommendations to not just stuff that's going to bring you down your own echo chamber, but expand it.
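What such a toggle might do under the hood, sketched with made-up interest vectors: blend a "similar to me" score with a "different from me" score according to a user-set tolerance. This is an assumption about the mechanics, not Minds' actual recommendation algorithm.

```python
def similarity(a, b):
    """Cosine similarity between two interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(a) * norm(b))

def rank_feed(user_vec, posts, tolerance):
    """tolerance=0.0 is a pure echo chamber; 1.0 is maximally different."""
    def score(vec):
        sim = similarity(user_vec, vec)
        return (1 - tolerance) * sim + tolerance * (1 - sim)
    return sorted(posts, key=lambda p: score(p["vec"]), reverse=True)

# Hypothetical three-topic interest space: [vegan, carnivore, privacy].
user = [0.9, 0.1, 0.5]
posts = [
    {"title": "vegan meal prep tips", "vec": [1.0, 0.0, 0.0]},
    {"title": "carnivore diet results", "vec": [0.0, 1.0, 0.1]},
]
for post in rank_feed(user, posts, tolerance=0.8):
    print(post["title"])  # the carnivore post ranks first at high tolerance
```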
01:11:20.000 Now, Daryl, I want to talk to you about your personal experience on Minds with what you do, what you're known for.
01:11:27.000 Have you had interactions with people on Minds that have been favorable, that you've kind of pushed people into a... Yeah, I've had a few, and I've had my share of detractors.
01:11:37.000 Some people think what I'm doing is totally wrong and don't get me or whatever.
01:11:41.000 But yeah, I've had interactions with some people.
01:11:44.000 When you say people have said it's totally wrong, what kind of criticisms do they have for that?
01:11:50.000 It depends upon where they're coming from.
01:11:52.000 Some people think it's not my job to teach white people how to treat us.
01:11:57.000 Us meaning black people.
01:11:59.000 Others think it's ridiculous to sit down with a white supremacist.
01:12:02.000 Why would you waste your time?
01:12:04.000 Those people can't change.
01:12:06.000 Do you point to your success ratio?
01:12:08.000 Because it's pretty amazing.
01:12:09.000 Oh, yeah.
01:12:09.000 I point to that.
01:12:10.000 But a lot of people, they don't see that, because they would not tolerate the time to sit down and have somebody tell them some nonsense that Jews are the children of the devil or some crazy thing like that.
01:12:22.000 I will sit and listen to that and I will put up with it.
01:12:25.000 Because in order for me to speak my mind, I have to listen to somebody else's.
01:12:33.000 So they're not willing to put in that time.
01:12:36.000 I am.
01:12:38.000 How do you have the time to do this?
01:12:41.000 This is what I do.
01:12:43.000 In between my music gigs.
01:12:46.000 What kind of commitment are you talking about?
01:12:49.000 How much time do you spend doing this?
01:12:50.000 A lot.
01:12:51.000 It's my life now.
01:12:52.000 How many, like, email dialogues?
01:12:54.000 Oh, God.
01:12:55.000 I get emails all the time.
01:12:57.000 I get emails from people I don't even know.
01:12:58.000 I even get emails from people who've seen me on podcasts or on TV shows.
01:13:03.000 These are white supremacists, Klansmen, whatever, and say, you know, you made some sense in it.
01:13:07.000 Would you like my robe?
01:13:08.000 I've even gotten robes in the mail from people I don't even know.
01:13:13.000 Yes.
01:13:14.000 I think there's a lot of sad people that just need a group of people to belong to.
01:13:19.000 And they'll decide that what these people are saying makes sense because at least they'll be a part of it.
01:13:24.000 Let me explain something to you.
01:13:26.000 As you already know, one's perception is one's reality.
01:13:32.000 You cannot change anybody's reality.
01:13:35.000 If you try to change their reality, you're going to get pushback because they only know what they know.
01:13:39.000 Whether it's real or not, it's their reality.
01:13:42.000 So what you want to do is you want to offer them a better alternative perception.
01:13:48.000 And if they resonate with your perception, then they will change their own reality because their perception becomes their reality.
01:13:54.000 Just a quick example.
01:13:55.000 Let's say you got a seven or eight-year-old brother, right?
01:13:58.000 And he goes to a magic show with his buddies.
01:14:00.000 And he comes back and tells you, Joe, you know, this magician, he asked for a female volunteer and 50 women raised their hand.
01:14:06.000 He picked up this one, come up on stage.
01:14:08.000 He told her to climb into this long box and stick her feet out that hole and put her head out this hole.
01:14:13.000 Then he closed the lid, told her to wiggle her feet, and she kicked her legs, and he took a chainsaw and went and cut that box in half.
01:14:21.000 He cut that woman in half.
01:14:23.000 And you're like, it didn't really happen like that.
01:14:26.000 Yes, it did.
01:14:26.000 I was there.
01:14:27.000 You weren't even there.
01:14:28.000 I saw it with my own eyes.
01:14:29.000 You are challenging his reality.
01:14:31.000 He knows what he saw.
01:14:55.000 And out popped the woman full form, no blood.
01:14:57.000 He cut her in half and he put it back together.
01:14:59.000 And you're saying, eh, it was just an illusion.
01:15:01.000 No, it wasn't.
01:15:02.000 I saw it with my own eyes.
01:15:03.000 I was there.
01:15:04.000 You weren't even there.
01:15:04.000 So again, you're attacking his reality.
01:15:07.000 He's going to resist.
01:15:07.000 He's going to fight you.
01:15:08.000 All right?
01:15:09.000 So what you do is you offer him a better perception.
01:15:13.000 You say, hey, listen, I hear what you're saying.
01:15:15.000 But could it be possible that just maybe...
01:15:18.000 Out of those 50 women that raised their hands and he picked one, maybe she works for him.
01:15:24.000 Maybe he planted her in the audience.
01:15:26.000 She knows the trick.
01:15:27.000 She travels to every show around the country with him.
01:15:29.000 And when she gets in the box, there's a pair of mannequin legs laying on the floor of the box that are wearing the same stockings and same shoes that she has on.
01:15:38.000 She picks them up, shoves them out the hole.
01:15:41.000 When he says, move your feet, she shakes those things.
01:15:43.000 And then she brings her own legs up under her chest.
01:15:46.000 So her whole body is on that half of the box.
01:15:49.000 So the saw doesn't even touch her.
01:15:51.000 And obviously when he separates the two halves, the feet are over there.
01:15:55.000 Now she can't move them.
01:15:56.000 So he has to distract your attention by going over here.
01:15:59.000 So you're not looking at those feet.
01:16:00.000 And he's talking to the head and she's talking back.
01:16:03.000 Of course, when he brings them back together, she pulls the dummy legs, leaves them on the floor of the box.
01:16:08.000 She climbs out.
01:16:09.000 And then your brother says...
01:16:11.000 Hmm.
01:16:12.000 You know, I guess that would be the only way that would work.
01:16:14.000 You've offered him a better perception, and that perception then becomes his reality.
01:16:20.000 So don't attack somebody's reality, regardless of what it is, even if you know it to be false.
01:16:25.000 Give them a better perception and allow them to resonate with it, because it's always better when somebody comes to the conclusion, I've been wrong.
01:16:32.000 Maybe this is something I need to think about.
01:16:34.000 Yeah, this will work.
01:16:35.000 It's a perfect example of not silencing people's ideas but giving them better ideas.
01:16:41.000 And this is what the answer to censorship has been.
01:16:44.000 Exactly.
01:16:45.000 And, you know, so Daryl always talks about how much he listens when he starts a dialogue and doesn't even try to, you know, push ideas at the people that he's engaging with, different extremists or whatnot.
01:16:56.000 Would you agree with that statement?
01:16:58.000 Absolutely.
01:16:58.000 And let me just give you an example of that.
01:17:01.000 I'm interviewing a Klan leader, white supremacist, right?
01:17:04.000 And I ask, you know, how can you hate me?
01:17:06.000 You don't even know me.
01:17:07.000 You know, all you see is this.
01:17:09.000 You come in my room five minutes ago and you've already determined, you know, whatever you determine.
01:17:15.000 Well, Mr. Davis, you know, black people are prone to crime.
01:17:17.000 And that is evidenced by the fact that there are more blacks in prison than white people.
01:17:21.000 Now, I'm just sitting here listening to this guy.
01:17:23.000 He's calling me a criminal.
01:17:25.000 But he's right.
01:17:26.000 He's 100% right.
01:17:28.000 The data and the statistics show that there are more blacks in prison than white people.
01:17:33.000 So that feeds what he already thinks he knows, the data, right?
01:17:38.000 But he does not go to find out why does that data show that.
01:17:43.000 He doesn't realize there may be an imbalance in our judicial system that sends black people to prison for longer periods of time than white people who've committed the same crime.
01:17:52.000 So I just listen to him.
01:17:53.000 Because when he walks in that room and he sees me, I'm the enemy.
01:17:56.000 His wall goes up.
01:17:58.000 His ears are like this.
01:17:59.000 He's ready to defend whatever his stance is.
01:18:02.000 So I'm just listening.
01:18:04.000 And then he goes on to say, you know, black people are inherently lazy.
01:18:08.000 They always have their hand out for a freebie.
01:18:10.000 They're always trying to scam the government welfare programs and all that kind of stuff.
01:18:14.000 So now he's called me a criminal.
01:18:17.000 Now he's calling me lazy.
01:18:19.000 And I'm just sitting here listening.
01:18:20.000 I'm not pushing back.
01:18:21.000 And then he says, and black people are born with smaller brains.
01:18:26.000 And the larger the brain, the more capacity for intelligence.
01:18:29.000 The smaller the brain, the lower the IQ. So now I'm being called stupid.
01:18:33.000 Now, he says that this is evidenced by the fact that every year the data shows that black high school students consistently score lower on their SATs than white kids do.
01:18:45.000 Again, he's 100% correct.
01:18:47.000 That does show that.
01:18:49.000 But he doesn't realize why.
01:18:51.000 Where do most black kids in this country go to school?
01:18:54.000 In the inner city.
01:18:55.000 Where do most white kids go to school?
01:18:57.000 In the suburbs.
01:18:58.000 It is a fact.
01:18:59.000 Suburban schools are better funded.
01:19:01.000 They have better facilities, better teachers, etc.
01:19:03.000 I will guarantee you, white kids who go to school in the inner city can score just as low as those black kids, if not lower.
01:19:11.000 Black kids who go to school in the suburbs can score just as high as the white kids, if not higher.
01:19:16.000 It has absolutely nothing to do with the color of the student's skin or the size of the student's brain.
01:19:24.000 But it has everything to do with the educational system in which that child is enrolled.
01:19:28.000 But of course, he won't go to research that because the data already supports what he already believes, that I'm inferior.
01:19:34.000 So now he's called me all these things.
01:19:37.000 I've already done my research on him.
01:19:39.000 I know this guy sitting across from me just barely made it out of high school.
01:19:42.000 I have a college degree.
01:19:44.000 So do I throw that in his face?
01:19:45.000 No.
01:19:46.000 But because I sat there and listened to him, that wall is coming down.
01:19:51.000 Because you cannot impart information to somebody when the wall is up.
01:19:55.000 It's like hitting a brick wall.
01:19:56.000 You want that wall to come down and then the ears open up.
01:19:58.000 So now he's exhausted all his vitriol.
01:20:01.000 And now he's wondering, like, how come this black person isn't pushing up against me like most of them do?
01:20:05.000 And he's curious as to what I think about what he just said.
01:20:09.000 So now the wall is down and he feels compelled to reciprocate because I sat there and listened to him insult me.
01:20:14.000 So now it's my turn.
01:20:15.000 I could go on the offense and say, no, you are the one who's a criminal.
01:20:19.000 You're the one hanging black men from trees and dragging them behind pickup trucks and bombing their churches.
01:21:30.000 And I would be 100 percent correct, because the Klan has over a hundred-year history of doing that.
01:20:30.000 But if I did that, that wall would go right back up.
01:20:32.000 So I don't want that to happen.
01:20:33.000 I want to keep the wall down and let him hear what I'm saying.
01:20:36.000 So rather than go on the offense, I go on the defense.
01:20:40.000 And I say, listen, I hear what you're saying.
01:20:42.000 However, I don't have a criminal record.
01:20:45.000 And I'm as black as anybody you've ever seen.
01:20:47.000 So I don't have a criminal record.
01:20:50.000 I've never been on welfare.
01:20:52.000 As far as my brain size goes, I've never measured the size of my brain, but I'm sure it's the same size as anybody else's.
01:20:58.000 And as far as my SAT scores go, they got me into college.
01:21:02.000 Now, I already know that he doesn't have a college degree.
01:21:05.000 I do.
01:21:06.000 Does it make me a better person than him?
01:21:07.000 No.
01:21:08.000 But it gives me a better experience, right?
01:21:10.000 So I let him know this.
01:21:12.000 He goes home.
01:21:14.000 And he thinks, just like we all do at the end of the day, we reflect on what we did during the day.
01:21:17.000 He thinks, man, I just had a three-hour conversation with a black guy, you know, and we didn't come to blows.
01:21:22.000 And what that Daryl guy said, it makes sense.
01:21:25.000 Oh, but he's black.
01:21:27.000 But what he said was true.
01:21:28.000 But he's black.
01:21:29.000 So they're having a cognitive dissonance, right?
01:21:32.000 And they struggle with that for a while.
01:21:34.000 And then they have that dilemma.
01:21:36.000 I've got to make up my mind, what am I going to do?
01:21:38.000 So the dilemma is, do I disregard whatever color he is and believe the truth because I know it to be true and change my ideological direction?
01:21:48.000 Or do I consider the color of his skin and continue living a lie?
01:21:52.000 In most cases, people will follow the truth.
01:21:54.000 But then there will be those who don't want to give up the power or the notoriety or whatever, and they will follow the lie.
01:22:01.000 Well, the way you're doing it is brilliant because you're doing it so patiently and contrary to the way most people handle arguments.
01:22:08.000 Most people handle arguments by trying to shut down the other person's argument and shit all over them instead of trying to, what you're saying, offer an alternative perspective, which is really probably the only way to get people to think about things in a different light.
01:22:21.000 And, Joe, that comes from the fact that I've done a lot of travel.
01:22:25.000 I've been exposed to people from all over the world.
01:22:27.000 And we all got along.
01:22:28.000 We all got along.
01:22:29.000 You told a story the first time you were on the podcast about not even understanding racism until you were a child, because you grew up overseas.
01:22:37.000 Right.
01:22:37.000 Exactly.
01:22:37.000 So I saw that.
01:22:39.000 So I saw something that they have not seen.
01:22:41.000 Right.
01:22:42.000 And that's why I want to share that with them vicariously, to let them know.
01:22:46.000 Every white person in the world is not like every white person in this country.
01:22:51.000 Every black person in the world is not like every black person in this country.
01:22:54.000 You know, there are white people over in France, like in the 1940s and 50s, a lot of black Americans moved to France to live.
01:23:01.000 Some even gave up their U.S. citizenship because the French people were treating them as equals.
01:23:06.000 They didn't see color, you know?
01:23:08.000 And those French people were a lot more white than the white people here in this country who might be mixed with something else.
01:23:13.000 So, you know, people need to see.
01:23:15.000 In fact, my favorite quote of all time is by Mark Twain, or otherwise known as Samuel Clemens.
01:23:20.000 It's called the travel quote.
01:23:22.000 And Mark Twain said, quote-unquote, travel is fatal to prejudice, bigotry, and narrow-mindedness, and many of our people need it sorely on these accounts.
01:23:31.000 Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one's lifetime.
01:23:38.000 That guy was so good.
01:23:39.000 Wasn't he?
01:23:40.000 He had so many great quotes.
01:23:41.000 Exactly.
01:23:42.000 Isolation.
01:23:43.000 And so Sam Harris actually did a study that we talk about in the paper.
01:23:47.000 He did a neuroimaging study of people being exposed to political beliefs different from their own and actually looked at people's brains when they were going through this experience.
01:23:59.000 And they actually talked about this thing called the backfire effect, which is sort of what you're talking about when the wall's up.
01:24:07.000 And so they sort of detected that, interestingly.
01:24:10.000 And I forget the exact name of the study, but it's in the footnotes.
01:24:14.000 So I think the patience is it.
01:24:19.000 That it's long-term.
01:24:21.000 You're not changing someone's mind, like, in five minutes of, you know...
01:24:25.000 Chattering in comment sections or, you know, yelling at someone at the dinner table that you barely know.
01:24:32.000 Like, Daryl knows how to create long-term relationships and not be, like, thirsty for them to change their mind.
01:24:39.000 Like, it's just by, like, look, we're here.
01:24:41.000 We're hanging out.
01:24:42.000 Whether it's a network or, you know, offline or online network, it doesn't really matter.
01:24:47.000 And so I think the backfire effect that Sam found and that we're sort of talking about with walls going up is very real.
01:24:54.000 And that's why...
01:24:56.000 It has to be long term.
01:24:58.000 You know, Daryl, I'm just thinking while I'm listening here, like these conversations that you've had with these white supremacists and neo-Nazis, how amazing would it be if that was a podcast?
01:25:10.000 It is.
01:25:11.000 No, but I'm saying, if you sat down with those people from the beginning, from first meeting them, and see that conversation play out, that would be very relatable.
01:25:21.000 I've got some of that.
01:25:23.000 Do you?
01:25:23.000 Where I've sat down with some of these people while they were still in, and now I'm sitting down with them now that they're out.
01:25:28.000 Some of them even come on my lecture tours with me and stand on stage with me and speak out against their former organization.
01:25:34.000 Do you have videos of these conversations?
01:25:36.000 Yeah, some of them, yeah.
01:25:37.000 God, are they online?
01:25:39.000 Some of them I think are, but if not, I can send you some.
01:25:41.000 I think those videos would be a great tool for someone that's maybe trapped, but at least partially open-minded, where they have this view of things, like, maybe I'm incorrect about this.
01:25:53.000 Maybe I need to re-evaluate.
01:25:55.000 But as a podcast, that would be brilliant.
01:25:59.000 That's a great idea: to have a KKK member walk in from the jump and have this conversation where they sit down with you over hours and hours and present all these articles about crime, brain size, all this shit, and have you just tell them your perspective and see the wheels start turning.
01:26:21.000 Because I think sometimes A lot of these people they're only interacting with people that think like them.
01:26:28.000 Right, exactly.
01:26:29.000 Now, I'll give you a crazy-ass example of something, right?
01:26:34.000 Unbelievable, right?
01:26:34.000 So this Exalted Cyclops, which means a district leader in the Klan.
01:26:39.000 Okay.
01:26:40.000 Okay.
01:26:41.000 So he's in my car with me, right?
01:26:43.000 Dragons, wizards, exalted cyclops.
01:26:45.000 That's hilarious.
01:26:46.000 So he's in passenger seat.
01:26:47.000 I'm driving.
01:26:48.000 And we got on the topic of crime and stuff.
01:26:51.000 And he was talking about, you know, black-on-black crime and how violent we were and all that kind of stuff.
01:26:58.000 And he said, you know, black people have a gene within them that makes them violent.
01:27:03.000 Now, I'm driving.
01:27:04.000 He's over here.
01:27:05.000 And I said, you know, what are you talking about?
01:27:06.000 And he says, well, look at all the carjackings and drive-bys in the southeast.
01:27:11.000 He was referring to southeast Washington, D.C., which is predominantly black.
01:27:14.000 There's some whites that live there, but it's predominantly black, very high crime-ridden.
01:27:18.000 I said, okay.
01:27:19.000 I said, but, you know, you're not considering the demographics.
01:27:22.000 That's what lives there.
01:27:23.000 I said, what about all the crime in Bangor, Maine?
01:27:26.000 White people, that's what lives there, right?
01:27:28.000 I said, you know, he goes, no, no, no, that has nothing to do with it.
01:27:30.000 You know, you all are born with that gene.
01:27:32.000 And I said, look at me.
01:27:34.000 I said, I have never...
01:27:36.000 I'm as black as anybody you know.
01:27:38.000 I have never committed a drive-by or a carjacking.
01:27:41.000 How do you explain that?
01:27:43.000 This man didn't even think about it.
01:27:45.000 He didn't hesitate one second.
01:27:46.000 He goes, your gene is latent.
01:27:48.000 It hasn't come out yet.
01:27:49.000 It almost came out then, but, you know...
01:27:51.000 But, I mean, he had an answer for everything.
01:27:55.000 And I was, you know, stupefied.
01:27:56.000 Like, he's over here all smug.
01:27:58.000 You know, you got nothing to say.
01:27:59.000 And so I thought about it.
01:28:01.000 Well, if I gave him some, you know...
01:28:05.000 Ph.D. knowledge or whatever.
01:28:07.000 It wouldn't faze him.
01:28:08.000 So I had to go to where he was.
01:28:11.000 I said, well, you know, we all know that every white person has a gene in them that can make them a serial killer.
01:28:18.000 And he says, how do you figure?
01:28:19.000 I said, well, name me three black serial killers.
01:28:22.000 He couldn't do it.
01:28:23.000 I said, I'm going to name you one.
01:28:25.000 I named one for him.
01:28:26.000 I said, here's one, just give me two.
01:28:28.000 He couldn't do it.
01:28:30.000 I said, Charles Manson, Jeffrey Dahmer, Henry Lee Lucas, John Wayne Gacy, Ted Bundy, Albert DeSalvo, the Boston Strangler, David Berkowitz, Son of Sam, on and on.
01:28:40.000 I said, they're all white.
01:28:41.000 I said, son, you are a serial killer.
01:28:43.000 He goes, Daryl, I never killed anybody.
01:28:46.000 I said, your gene is latent.
01:28:47.000 It just hasn't come out yet.
01:28:49.000 He goes, well, that's stupid.
01:28:51.000 And I said, well, duh.
01:28:52.000 I said, yeah, it is stupid for me to say that about you.
01:28:55.000 But it's no more stupid for me to say that about you than what you said about me.
01:28:58.000 And he got very, very quiet.
01:29:01.000 You could see his wheels going bzzz.
01:29:03.000 And then he changed the subject.
01:29:04.000 And within five months, he quit the Klan.
01:29:07.000 And his robe was the first robe I got.
01:29:09.000 Yeah.
01:29:10.000 Based on that stupid conversation.
01:29:11.000 I remember that conversation you were relaying on the podcast.
01:29:14.000 Yeah.
01:29:14.000 I had a conversation on a podcast many years ago where a guy actually did bring up that gene thing with black people.
01:29:21.000 Oh, it's common.
01:29:22.000 Yeah.
01:29:22.000 And he said it, and I didn't know the guy before I had him on, and while I was having him on, I was realizing a lot of the shit that this guy's saying...
01:29:31.000 I probably shouldn't have had him on.
01:29:33.000 No, you should have!
01:29:35.000 Yeah, but back in those days, I would have people on, I would just read something, they'd say, well, this is probably a conversation that's controversial, I'll talk to this guy.
01:29:44.000 But some of the things he was saying, that was one of them, was that black people had this gene for violence.
01:29:48.000 And I go, well, how the fuck do you explain war?
01:29:51.000 My take was like most wars started by white people.
01:29:54.000 Like if you looked at the amount of war that goes on in the world, worldwide, like how much of it is instigated and initiated by white people and is there a thing more violent than war?
01:30:06.000 Nothing.
01:30:07.000 It's like literally you're telling people that don't even know people that it's their obligation to kill someone based on what land they're from or what part of the world.
01:30:17.000 That's the most violent shit we know, and it's all by white people.
01:30:20.000 Black-on-black crime is a myth.
01:30:22.000 No such thing.
01:30:23.000 It's a crime of proximity, okay?
01:30:26.000 Because they need something immediately.
01:30:27.000 They're not going to go all the way across town to the white neighborhood and attack some white guy.
01:30:31.000 Somebody right here might have it.
01:30:33.000 Go into his house, break it, take his stuff, beat him up, whatever, all right?
01:30:36.000 So we hear about black-on-black crime.
01:30:38.000 So do we call Russia invading Ukraine and killing all these people white-on-white crime?
01:30:43.000 That's exactly what it is.
01:30:44.000 Yes.
01:30:45.000 Yeah, I mean, and some people are actually using that as an argument for how racist the way we look at war is.
01:30:52.000 Because during the time where all this is happening in Ukraine, how many people are bombed in Yemen?
01:30:58.000 How many people are bombed in all...
01:31:00.000 All sorts of parts of the world where there are these military actions that we're ignoring.
01:31:05.000 There's actually a chart that someone put up.
01:31:07.000 It's like a graphic that shows the bombings and the people that died in Ukraine versus the people that are dying right now simultaneously due to US drone strikes and all sorts of other shit that's happening all over the world at the same time.
01:31:24.000 We're concentrating on this one thing, and it's in the news, and that's part of the reason why people are concentrating on it so much.
01:31:30.000 Well, I learned a long time ago when I was living overseas, if you want to learn about your own country, read a foreign newspaper.
01:31:38.000 Like the Herald Tribune, the French paper, their perspective on what's going on in the U.S. Because we don't tell our own people everything, just the same way
01:31:46.000 Russians don't tell their own people everything.
01:31:49.000 I'm interested, you know, that you had that feeling that, you know, maybe you shouldn't have had that person on.
01:31:55.000 This was early in the podcast.
01:31:56.000 I know, I know, but I'm just, I'm saying that I think that, because I'm sure that was a, I don't know who you're talking about, but I'm sure that was a productive conversation in certain ways, and I feel like there's this chilling effect that is happening.
01:32:09.000 Where we're afraid to have a conversation with a murderer.
01:32:16.000 Or maybe not a murderer, but that's kind of the funny thing.
01:32:19.000 You could interview probably a serial killer on this show, and that would be fascinating.
01:32:23.000 And no one would be like...
01:32:25.000 Oh, dude, Joe's, like, gonna become a serial killer.
01:32:28.000 He just had a serial killer on his show.
01:32:30.000 And, like, people are obsessed with true crime and, you know, obsessed with interviews with some of the worst humans that have ever existed.
01:32:39.000 And those are considered to be extremely valuable interviews.
01:32:43.000 And I think that you should...
01:32:45.000 I hope that you, you know...
01:32:49.000 Own your ability to do that in a way where people aren't assuming that you think or you endorse the views of people that you're talking to.
01:33:00.000 That is a sickness.
01:33:01.000 This is an argument that's always going to take place where you're platforming those people.
01:33:05.000 This is the dialogue that the left likes to use today, that you're platforming these people.
01:33:11.000 That's what I hear if we're sitting down with those people.
01:33:13.000 It's so dumb.
01:33:14.000 It's such a dumb argument, especially in your case.
01:33:17.000 Look at the results.
01:33:18.000 What other human being has a documented result of literally hundreds of KKK and neo-Nazi people abandoning their ideology because they've had a conversation with you and literally had a change of heart, an actual change of heart?
01:33:33.000 Yeah, no.
01:33:34.000 No journalist whining about, you know, intense content on the internet has ever de-radicalized anybody.
01:33:43.000 They have no track record.
01:33:45.000 They have no data.
01:33:46.000 So it's just all emotional.
01:33:48.000 In fact, it polarizes some people that disagree with them.
01:33:51.000 Yeah.
01:33:51.000 Especially when those people get banned.
01:33:53.000 If they get banned from social media platforms for having different perspectives or different views.
01:33:58.000 Well, for instance, sorry, Vice did a piece about us.
01:34:02.000 And they said, Minds has no idea what to do with all the neo-Nazis.
01:34:07.000 And just, like, I talked to these reporters for hours and explained to them what we were working on with Daryl.
01:34:13.000 And we were sort of in the beginning of phases of writing this paper.
01:34:16.000 And they so disingenuously characterized what we were trying to do.
01:34:20.000 There's a lot of bad-faith conversations over there.
01:34:22.000 It's so...
01:34:23.000 It's toxic.
01:34:25.000 And, you know, I'm just hoping that, honestly, no offense to them.
01:34:28.000 I feel like they're...
01:34:30.000 They're in their world.
01:34:31.000 Hopefully we can all get on the same page somehow about what's actually going on here.
01:34:36.000 I'm not trying to have a combative tone with any of these media outlets or with big tech even.
01:34:44.000 I don't want to polarize it between alternative tech and big tech.
01:34:47.000 It's like, we need tech
01:34:50.000 to adopt certain principles that have to do with digital rights and freedom.
01:34:55.000 That's just a reality.
01:34:56.000 It has to happen.
01:34:57.000 And for the ones that do, what would be smarter, whether it's Google, Facebook, Twitter, whoever, than to actually start doing some of this stuff and start to be more transparent?
01:35:08.000 I think the amount of moderation that they would require would be extraordinary.
01:35:13.000 You can achieve it with community-centric moderation.
01:35:17.000 Pay the users to help.
01:35:18.000 Yeah, but they're not going to do that.
01:35:20.000 They will.
01:35:20.000 They are.
01:35:21.000 They do it.
01:35:21.000 You think so?
01:35:21.000 Yes.
01:35:22.000 Who's going to do that?
01:35:22.000 Facebook?
01:35:24.000 Anyone.
01:35:24.000 Yeah.
01:35:25.000 Twitter actually rolled out- They're going to pay users to moderate.
01:35:27.000 They should.
01:35:28.000 They don't- But they're not- You're saying they're gonna.
01:35:30.000 I'm saying they're not.
01:35:31.000 Well, I'm saying that Twitter rolled out a product called Birdwatch.
01:35:35.000 Which was a – and I don't know if it's still going on, but this was like last year.
01:35:40.000 It was a community-centric moderation tool to get the – so let's separate payments from actually getting the community involved in the moderation.
01:35:48.000 So communities are already heavily involved in moderation.
01:35:51.000 They're doing the reporting.
01:35:52.000 They're flagging stuff.
01:35:54.000 And then it's getting escalated through – Yeah, but they flag things that aren't even really offensive.
01:35:58.000 And that's why – They do it to fuck with people.
01:36:00.000 Right.
01:36:00.000 And so you have to be careful of that.
01:36:02.000 But that's why juries are – I think that juries are a big part of the future of moderation on social media.
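Since the jury idea only comes up in passing here, a minimal TypeScript sketch may help illustrate it. The type names, jury size, and supermajority threshold below are all assumptions for the example, not a description of Minds' or any platform's actual system:

interface Vote {
  jurorId: string;
  upholdFlag: boolean; // true = juror agrees the content violates policy
}

// Draw a random jury from active users, excluding interested parties
// (e.g. the post's author and the original reporter).
function selectJury(activeUserIds: string[], exclude: Set<string>, size: number): string[] {
  const pool = activeUserIds.filter((id) => !exclude.has(id));
  for (let i = pool.length - 1; i > 0; i--) {
    // Fisher-Yates shuffle so every eligible user is equally likely to serve
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, size);
}

// Require a supermajority to remove content, biasing toward keeping speech
// up and making it harder for a handful of bad-faith flags to take a post down.
function decide(votes: Vote[], threshold = 0.75): "remove" | "keep" {
  const uphold = votes.filter((v) => v.upholdFlag).length;
  return votes.length > 0 && uphold / votes.length >= threshold ? "remove" : "keep";
}

The random draw addresses exactly the problem raised above: a single motivated group can coordinate flags, but it cannot easily stack a randomly selected jury.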
01:36:10.000 And Daryl, you were about to say something.
01:36:12.000 So, you know, there's a lot of hypocrisy about who to put on a platform, who not to put on a platform.
01:36:18.000 I do a lot of speaking to a lot of colleges across the country, universities.
01:36:22.000 And I would say two or three times a year, you know, some student activities board or student council has booked me.
01:36:30.000 And then two weeks before the event, the administration will shut it down.
01:36:33.000 Oh, no, no, we can't have him on campus.
01:36:35.000 He's too controversial.
01:36:36.000 Stir stuff up.
01:36:37.000 Which is not true.
01:36:39.000 At all.
01:36:40.000 They don't want to deal with it.
01:36:42.000 And this is unfortunate because they are an institution of higher learning.
01:36:46.000 While on the campus, perhaps everybody is being treated equally.
01:36:50.000 Gay people, LGBTQ, black, white, Muslim, Jewish, whatever.
01:36:56.000 Within the confines of the campus, that may be true, but the objective of higher education is to teach people how to navigate society beyond the campus and be a productive citizen, right?
01:37:08.000 So you got to let people learn that, hey, you're a woman.
01:37:12.000 Here, you're treated equally.
01:37:13.000 But when you graduate and you go out there and work in the real world, you might be sexually harassed by your boss.
01:37:19.000 You might not get paid as much as your male counterpart who knows less than you or whatever.
01:37:23.000 Or you might not get the job because you're black or because you're whatever.
01:37:27.000 This is, you know, in addition to the academic education, they need this empirical education.
01:37:33.000 And those institutions that are shutting me down are not providing it.
01:37:37.000 But what I was going to say also was today you got – and speaking of cancel culture, you got people banning books and banning history classes under the guise of CRT, critical race theory, things like that.
01:37:49.000 You've seen the pictures of a black girl walking towards a white school building for the first time, people behind her yelling at her and all that kind of stuff, or the four black guys sitting at the Woolworths counter in Greensboro and people pouring stuff over their heads.
01:38:04.000 1960s.
01:38:05.000 Exactly.
01:38:07.000 Those white people that did this made history back then, and now it's those same people that are saying, we don't want that taught in the schools.
01:38:18.000 So make up your mind.
01:38:19.000 You know, that is history.
01:38:21.000 It's part of American history.
01:38:22.000 Whether it's good, bad, ugly, or shameful, all those cards need to be turned face up.
01:38:27.000 And it'll be transparent, and then we address them, and then we move on together.
01:38:32.000 Okay?
01:38:32.000 But history is history.
01:38:34.000 So don't create history and then tell me you don't want that history being taught that you created, that you were so proud of.
01:38:41.000 You know, I'm going to stand in the doorway and not let these black kids come in.
01:38:45.000 I think it's about, is there a neutral lens that we can look at those events?
01:38:50.000 And I think that some of the criticism of CRT is that it's not approaching those events from a neutral lens.
01:38:59.000 And it's not that, you know...
01:39:00.000 What's neutral about police dogs attacking peaceful black marchers on the way to the courthouse to register to vote?
01:39:06.000 There's nothing neutral about it, but I think that there's definitely some ideology that is attached to critical race theory that is rooted in critical theory, which is a left-leaning...
01:39:21.000 There are multiple definitions of critical race theory, and nobody has really explained it.
01:39:30.000 You're trying to vilify white people as the oppressors and victimize black people as the oppressed, and that's how you are, and you will never change.
01:39:40.000 That's how the people who are opposed to it define it.
01:39:44.000 But, you know, that's not necessarily how some of the people who participated in the creation of it, like Kimberlé Crenshaw, I can't speak for all of them, define it, you know?
01:39:53.000 So it needs to be, all history needs to be taught, you know, and through the lens of what happened.
01:40:01.000 And then move forward.
01:40:03.000 But you can't create history and say, you know, we don't want to talk about it until 50 years later.
01:40:07.000 Like when I was in high school, I'll be 64 this month.
01:40:10.000 When I was in high school, we did not learn.
01:40:12.000 And I went to high school in Montgomery County, Maryland, which has one of the top school districts in the whole country.
01:40:18.000 Montgomery County, Maryland and Fairfax County, Virginia.
01:40:21.000 We tie neck and neck each year.
01:40:23.000 Anyway, we were not taught that we had Japanese internment camps in this country.
01:40:27.000 I did not learn that until I was in college.
01:40:30.000 I'm like, what?
01:40:31.000 Are you kidding me?
01:40:31.000 No way.
01:40:32.000 I asked my parents.
01:40:33.000 They said, yeah.
01:40:34.000 I could not believe I didn't learn that in high school.
01:40:36.000 Now, I knew about the Tulsa race riots 30 years ago.
01:40:39.000 People today are just now learning about that.
01:40:41.000 So, you know, that's what I'm saying.
01:40:43.000 We need to educate.
01:40:45.000 Education exposure is the key to advancement.
01:40:48.000 Well, what we need to do is apply your approach, the way you've had these conversations with these KKK people and these neo-Nazi people.
01:40:55.000 That has to be across the board with everything.
01:40:58.000 Let a person explain their position and then you come up with either a better argument or you agree with part of what they're saying or...
01:41:08.000 The only way is to not silence them, to let them talk.
01:41:12.000 So if people are against critical race theory for any particular reason, they should listen to the entire argument of what critical race theory entails from at least that person's perspective, and then this is what I agree with, this is what I don't agree with,
01:41:27.000 and have a conversation that's rational.
01:41:30.000 They're not using ad hominems.
01:41:33.000 They're not attacking the human.
01:41:34.000 They're not attacking the person with insults.
01:41:36.000 They're just talking about what is correct and incorrect about everything from economics to health care to everything.
01:41:45.000 These kinds of conversations are how people find out what's right and what's wrong and how people find out what resonates with them.
01:41:52.000 And as soon as you shut people down, those conversations stop.
01:41:56.000 And then these people go off into their own corner of the world where they are accepted, and they get in an echo chamber, and they just reinforce whatever stupid idea they had in the first place.
01:42:05.000 Yeah, what you were saying about watching people change their minds, like their interviews, that is so powerful.
01:42:10.000 And we're actually launching this Change Minds
01:42:14.000 sort of challenge, where we're going to try, as a campaign on the site, to have people make videos and tell stories of a meaningful time that they changed their mind.
01:42:24.000 Because everybody should be doing that more. What's a recent time you've changed your mind about something sort of meaningful?
01:42:33.000 Oh, I don't know.
01:42:33.000 It happens all the time, though.
01:42:35.000 Right, it happens all the time.
01:42:36.000 It's not only a woman's prerogative to change her mind.
01:42:38.000 No, I change my mind all the time.
01:42:40.000 I'll change my mind in the middle of a conversation.
01:42:42.000 I'll go, wait a minute.
01:42:43.000 I don't think so.
01:42:44.000 Let me change.
01:42:45.000 I'm going to change my mind right now.
01:42:47.000 I do that all the time.
01:42:48.000 But can you think of something in your life from when you were younger that you were really locked into?
01:42:54.000 What's just a big one?
01:42:55.000 Oh, I don't know, man.
01:42:56.000 I've had so many of them.
01:42:57.000 This conversation would take 15 minutes for me to sit down and think about it.
01:43:00.000 All right, Daryl, you got one?
01:43:02.000 Yeah, I can give you one.
01:43:04.000 Okay, so as a kid...
01:43:07.000 I learned that a tiger does not change its stripes.
01:43:09.000 A leopard does not change its spots.
01:43:12.000 And so when I first went in to interview white supremacists and KKK people or whatever, I was not going there to convert them.
01:43:22.000 Never.
01:43:23.000 Okay?
01:43:23.000 All I wanted to know was, how can you hate me when you don't even know me?
01:43:26.000 That's all I want to know.
01:43:26.000 And then I'm out of here.
01:43:27.000 I'll never see you again.
01:43:28.000 Okay?
01:43:30.000 Because if a leopard cannot change his spots and a tiger cannot change his stripes, why would I think that a Klansman could change his robe and hood?
01:43:39.000 It's who he is.
01:43:40.000 Right?
01:43:41.000 But I changed my mind because those conversations did change that person.
01:43:47.000 And you're right, a leopard cannot change its spots and a tiger cannot change its stripes because those two animals were born with those spots and stripes.
01:43:58.000 That Klansman or Klanswoman was not born with that robe and hood.
01:44:02.000 That was a learned thing.
01:44:04.000 And what can be learned can be unlearned.
01:44:06.000 So that's why I changed my mind and why I continue to do this today to sit down with those people.
01:44:12.000 And the only way that works is with open dialogue.
01:44:14.000 Exactly.
01:44:15.000 I mean, it's funny that you answered it like that because for you it's just second nature to constantly be changing.
01:44:21.000 I have a philosophy about that.
01:44:22.000 I don't think you should ever be your ideas.
01:44:25.000 You should never be connected to your ideas.
01:44:27.000 Your ideas should be something that's independent of you that you either think this is a good one or this is a bad one.
01:44:33.000 But if someone comes along and says that's a bad one, you shouldn't be defensive.
01:44:37.000 You shouldn't like hold on to it and cling to it.
01:44:39.000 Maybe like try to defend it because you think it's correct.
01:44:42.000 Like, oh, I thought that was right.
01:44:43.000 But then once it isn't, there's some people that for whatever reason never want to admit they're wrong because they think that being wrong makes them less.
01:44:53.000 Yeah, to play devil's advocate with ourselves, I mean, I'm not even ideological about our model.
01:45:03.000 I actually think that I'm open to seeing, you know, over the course of 10 years, like, let's actually come back in a few years and look at the data that...
01:45:15.000 The information that we've gathered about the rate of deradicalization and whatnot.
01:45:19.000 Like, what really works?
01:45:22.000 What is the most balanced moderation policy for a social network?
01:45:26.000 Like, you know, First Amendment, I think, is a great starting point.
01:45:29.000 And obviously there's edge cases, spam, weird, like, there's this...
01:45:33.000 Doxing.
01:45:34.000 Yeah, doxing.
01:45:35.000 That stuff we don't deal with.
01:45:36.000 And that's not covered in the First Amendment.
01:45:39.000 But I think that we're flexible.
01:45:44.000 This isn't like a dogmatic piece of policy.
01:45:52.000 But we need to A-B test it, at least.
01:45:54.000 I mean, for God's sakes, like, big tech is just, like, hemorrhaging censorship, just like...
01:45:59.000 People, just millions of people getting banned a day.
01:46:02.000 And we don't have something to test it against.
01:46:06.000 Like, where's the major network with a free speech policy that is, you know, a responsible free speech policy?
01:46:12.000 Let me ask you this.
01:46:13.000 What do you guys do about bad actors, like troll farms, like Russian troll farms, that kind of thing?
01:46:19.000 Yeah, I mean, so we have systems for detecting different types of spam and trolling. Like, harassment is not okay.
01:46:30.000 For harassment, you know, you'll get banned.
01:46:32.000 And harassment is, like, legally not allowed.
01:46:38.000 But there's all different types of spam.
01:46:41.000 And with misinformation and whatnot, I think that...
01:46:44.000 Yeah, but I'm talking about bad actors.
01:46:46.000 I'm talking about when you have these Russian troll farms, these are people that are hired to disseminate propaganda.
01:46:53.000 They're hired to muddy the waters of conversations by having fake arguments or bad faith arguments.
01:47:00.000 Mercenaries.
01:47:01.000 Yeah, they are.
01:47:02.000 You've seen those, right?
01:47:03.000 So we have the distinction between misinformation and disinformation.
01:47:07.000 The difference is that disinformation is intentional manipulation.
01:47:13.000 I think that it really depends on the context of the specific post that we're talking about.
01:47:19.000 So I don't want to make a generalization. There are troll farms in the US that are doing all kinds of inauthentic content engineering for different political purposes.
01:47:29.000 It doesn't matter where the content is from.
01:47:31.000 What do you do about it?
01:47:32.000 So what I'm saying is you look at it on a case-by-case basis and evaluate, you know, is it breaking the law?
01:47:40.000 Because at the end of the day, information is information.
01:47:44.000 If someone is trying to put...
01:47:45.000 Everything is propaganda.
01:47:47.000 Propaganda is coming from, you know, every single angle.
01:47:50.000 So it depends on the specific nature of the content and the troll that you're talking about.
01:47:55.000 I don't think that you can have a blanket solution saying...
01:47:58.000 Programming an AI to say, hey, every time, you know, you detect X, Y, and Z, just like...
01:48:04.000 Well, wouldn't an alternative be everyone has to have, like, a user ID, like a driver's license to register?
01:48:10.000 So you have one account because you are Bill Ottman?
01:48:13.000 Well, I think that's where the decentralized reputation is starting to come into play.
01:48:16.000 And there's this project, Verite, that's coming out.
01:48:19.000 There's the DID spec.
01:48:21.000 Which is starting to build this interoperable identity that you carry between social networks.
01:48:28.000 So basically you're bringing your credibility, your identity, whatever you want to share, whether it's art, you know, content, it's all tied to you and you're sort of moving around freely in a sovereign situation.
01:48:45.000 I think that's where we want to go long term so that you're not locked in.
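For context on the DID spec mentioned here: it's a W3C standard, and a DID document has roughly the following shape. The field names below follow the W3C DID Core data model, but the values are placeholders; did:example is the spec's own sample method, and the key is fake:

// Rough shape of a W3C DID document per the DID Core data model.
// "did:example" is a placeholder method; the key value is not real.
const didDocument = {
  "@context": "https://www.w3.org/ns/did/v1",
  id: "did:example:123456789abcdefghi",
  verificationMethod: [
    {
      id: "did:example:123456789abcdefghi#key-1",
      type: "Ed25519VerificationKey2020",
      controller: "did:example:123456789abcdefghi",
      publicKeyMultibase: "z6MkPlaceholderKeyValue", // not a real key
    },
  ],
  authentication: ["did:example:123456789abcdefghi#key-1"],
};

// Any network can resolve the DID, verify a signature against the listed
// public key, and attach reputation or content to an identity it never
// custodies, which is what makes the identity portable between platforms.

This is the sense in which identity "moves around freely": the user holds the private key, and no single platform controls the account.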
01:48:49.000 As technology evolves, so should ideology.
01:48:55.000 Yes.
01:48:55.000 Yeah, ideology should also, like, it needs testing.
01:49:01.000 Because your ideology should be tested.
01:49:04.000 And the best way to test your ideology is to have it encounter other ideologies and see if it stands up to scrutiny.
01:49:10.000 Exactly.
01:49:11.000 And the thing, when people don't want that, and they want people censored, what you're saying is your ideas won't hold up.
01:49:19.000 Because you don't want to, if we could all have debates in real time with good ideas versus bad ideas, and everyone gets a chance to watch, it's gonna be messy.
01:49:31.000 But at the end, you're gonna at least know where you stand.
01:49:37.000 Because you've had both arguments played out in front of you, whether it's left versus right or whatever it is when you're talking about ideologies.
01:49:45.000 You've got to watch these people have these conversations.
01:49:50.000 And if you can do that, you can kind of agree with one or disagree with the other and find out where you fit in this.
01:49:57.000 Or take something good from that person, something good from that person, and put them together.
01:50:01.000 Yes.
01:50:02.000 I think the focus on long form is key.
01:50:04.000 Yes.
01:50:05.000 And that's why, you know, so we do support video.
01:50:07.000 It's not, like, necessarily...
01:50:09.000 Do you host video?
01:50:10.000 Yeah, we do.
01:50:10.000 Yeah.
01:50:11.000 So someone can do, like, an hour-long video and upload it?
01:50:15.000 Yep.
01:50:15.000 Your bandwidth cost must be extraordinary.
01:50:17.000 Oh, God.
01:50:17.000 It's bad.
01:50:18.000 Yeah.
01:50:19.000 But there are, like, distributed systems, like IPFS that I mentioned, and Arweave, and some of these, like...
01:50:26.000 Systems where it's decentralized and, you know, you don't have to pay for all of the storage, but the bandwidth is still an issue and, you know, it's a spectrum with the decentralized stuff.
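As a rough illustration of the decentralized-storage idea being described, here is a minimal TypeScript sketch using the ipfs-http-client package's documented create()/add() calls. The daemon URL, file handling, and gateway choice are assumptions for the example, not a description of Minds' actual infrastructure:

// Sketch: offloading an uploaded video to IPFS via a local node.
import { create } from "ipfs-http-client";
import { readFile } from "node:fs/promises";

async function publishVideo(path: string): Promise<string> {
  // Assumes an IPFS daemon is running locally with its API on port 5001.
  const ipfs = create({ url: "http://127.0.0.1:5001" });
  const bytes = await readFile(path);
  const { cid } = await ipfs.add(bytes); // content is addressed by its hash
  // Anyone can now fetch the file from any public gateway by its CID, so
  // the platform pays to pin the content rather than to serve every request.
  return `https://ipfs.io/ipfs/${cid.toString()}`;
}

This is also why bandwidth remains the hard cost even on the decentralized spectrum: content addressing removes the single storage point, but someone still has to carry the bytes to each viewer.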
01:50:38.000 But, yeah, so, dude, I have this cool thing.
01:50:42.000 It is...
01:50:45.000 Daryl is out of time.
01:50:46.000 Daryl's out of time.
01:50:47.000 I just gotta wrap up with...
01:50:49.000 So this is called an Opendime Bitcoin wallet.
01:50:55.000 So this has a Bitcoin on it.
01:50:58.000 This has a full Bitcoin on it.
01:50:59.000 Okay.
01:51:00.000 And it has this hole that you can puncture.
01:51:03.000 So I can...
01:51:04.000 This is basically the cash equivalent.
01:51:05.000 It's a bearer instrument for Bitcoin.
01:51:08.000 So I can hand this to you.
01:51:11.000 Okay.
01:51:11.000 I'm not giving it to you, but just hold it.
01:51:13.000 We can see it up there.
01:51:14.000 Yeah.
01:51:15.000 So it works as like a USB drive?
01:51:17.000 So yeah, you plug in your computer and you can send Bitcoin to it and then you got to puncture that hole and that's what unlocks the private key.
01:51:25.000 So I can give it to you and you cannot access the Bitcoin on this until that hole is punctured.
01:51:30.000 And then you plug it back in and you can actually take control of the Bitcoin.
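One property of a bearer device like this is that anyone can audit the funds at its public address without being able to spend them. A minimal sketch of that check, using Blockstream's public Esplora API; the address passed in would be whatever address the holder publishes, none is hardcoded here:

// Sketch: auditing a published bearer address without spend access.
async function confirmedBalanceSats(address: string): Promise<number> {
  const res = await fetch(`https://blockstream.info/api/address/${address}`);
  const info = await res.json();
  // Confirmed balance = total satoshis ever received minus total ever spent.
  return info.chain_stats.funded_txo_sum - info.chain_stats.spent_txo_sum;
}

Until the device's seal is punctured, the private key is inaccessible, so a reader can confirm the coin exists on-chain while trusting that whoever physically holds the stick has not spent it.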
01:51:33.000 What does that have to do with censorship?
01:51:34.000 So, well, this is the ultimate censorship-resistant crowdfunding mechanism.
01:51:39.000 This is totally uncensorable money that anyone could send crypto.
01:51:43.000 Right, but we're talking about discussions, conversations.
01:51:46.000 Well, yeah, and so the reason I'm bringing it up is because we are putting a full Bitcoin towards, you know...
01:51:53.000 Our work with Daryl.
01:51:54.000 And we're going to have this basically sit.
01:51:59.000 And we're going to watch it over the years.
01:52:01.000 And we're going to use the funds.
01:52:03.000 The address for this wallet is published on minds.com slash change.
01:52:07.000 And so what we want to do, you know, you see all the censorship, the financial censorship happening, which is correlated to censorship of speech.
01:52:16.000 Google has now suspended monetization on YouTube for all users in Russia.
01:52:22.000 Applies to other services as well.
01:52:24.000 That's a whole creator industry up in smoke.
01:52:27.000 No way these guys can make up that revenue on...
01:52:31.000 It's unbelievable, yeah.
01:52:33.000 So I just wanted to bring this up because we're going to be doing more events.
01:52:38.000 We believe in offline events too.
01:52:41.000 It's not only an online social network.
01:52:43.000 And so, if anyone is interested in supporting the conversations, the long-form conversations we're having with Daryl, please, you know, send Bitcoin to this address, and we're going to put it towards that.
01:52:57.000 Thank you.
01:52:58.000 Thank you guys for coming here, and thank you, Daryl, for all of your time and effort that you put into this.
01:53:03.000 It's extraordinary.
01:53:04.000 I mean, your patience is unbelievable.
01:53:06.000 So is yours, my friend.
01:53:08.000 All the stuff you gotta put up with.
01:53:10.000 Yeah, well, I mean, that's what I do, though.
01:53:13.000 I guess that's what you do as well.
01:53:14.000 And, Bill, thanks for what you're doing with Minds.
01:53:16.000 Let's figure out what fuckery was going on and fix that on your account.
01:53:21.000 Let's figure out.
01:53:22.000 Okay.
01:53:22.000 All right.
01:53:23.000 Thank you, everybody.
01:53:24.000 I'll tell everybody one more time the Minds address slash change.
01:53:29.000 Yeah, Minds.com slash change.
01:53:30.000 You can also get the app, Minds.com slash mobile, or me at Ottman.
01:53:34.000 And, Daryl, what social media are you using other than Minds?
01:53:38.000 I use daryldavis.com.
01:53:42.000 I'm on Twitter, Instagram.
01:53:45.000 I have somebody handling that for me.
01:53:47.000 But mostly you use Minds now.
01:53:49.000 Yeah.
01:53:50.000 I use Minds.com.
01:53:51.000 Also FAIR, the Foundation Against Intolerance and Racism.
01:53:55.000 Okay.
01:53:55.000 Fairforall.org.
01:53:56.000 Well, let's do this again in the future when we have more time.
01:53:59.000 Thank you, Joe.
01:53:59.000 All right.
01:54:00.000 Thanks, Bill.