In this episode of the Joe Rogan Experience podcast, Joe sits down with Bill Ottman, founder of Minds, a social network committed to free speech and open source, and with Daryl Davis, the musician known for convincing Ku Klux Klan members to leave the Klan through dialogue. They talk about the new landscape of alternative social networks, why open source is key to the future of the space, the costs of censorship and deplatforming, and Daryl's decades of conversations with extremists. If you're interested in alternative social networks, this is a must-listen episode. Thanks for listening, and thanks for supporting the show.
00:01:27.000So I absolutely respect any network that is putting forward a free speech policy.
00:01:32.000But you can't have a free speech policy with sketchy algorithms and closed-source code, because then we don't know if you're soft censoring, shadow banning.
00:01:44.000We don't know what's happening in the news feed behind the scenes.
00:02:45.000We want to make it impossible for us to even take down our network at all.
00:02:50.000And that's why immutable distributed systems like blockchains and Tor and IPFS, all of these different decentralized systems, are emerging.
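The immutability Bill is describing rests on content addressing: in IPFS-style systems, a file's address is derived from a hash of its bytes, so nobody can silently swap the content behind an address. A minimal illustrative sketch of the idea in Python (not the actual IPFS implementation, which uses multihashes and Merkle DAGs):

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive an address from the content itself, IPFS-style."""
    return hashlib.sha256(data).hexdigest()

store = {}

def put(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data          # the address is bound to these exact bytes
    return addr

def get(addr: str) -> bytes:
    data = store[addr]
    # Any reader can verify the content was not silently altered:
    assert content_address(data) == addr
    return data

addr = put(b"hello, decentralized web")
assert get(addr) == b"hello, decentralized web"
```

Because the address commits to the exact bytes, any node can serve the content and any reader can verify it, which is what makes silent edits and takedowns so much harder on these networks.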
00:03:28.000It's more than 200 Ku Klux Klan members, neo-Nazis.
00:03:33.000I mean, we talked about these guys giving you their Klan outfits and retiring because they met you.
00:03:41.000And just because you had reasonable conversations with them made them realize how stupid these ideologies that they had somehow or another been captivated by.
00:03:51.000I mean, at the end of the day, you know, a missed opportunity for dialogue is a missed opportunity for conflict resolution.
00:04:13.000So all that is to say that I've been exposed to a multitude of skin colors, ethnicities, religions, cultures, ideologies, etc.
00:04:20.000And all of that has shaped who I've become.
00:04:22.000Now, all that travel does not make me a better human being than somebody else.
00:04:26.000It just gives me a better perspective of mass humanity.
00:04:32.000And what I've learned is that no matter how far I've gone from our own country, right next door to Canada or Mexico or halfway around the globe, no matter how different the people I encounter may be, they don't look like me, they don't speak my language, they don't worship as I do or whatever, I always conclude at the end of the day that we all are human beings.
00:04:50.000And as such, we all want these same five core values in our lives.
00:05:01.000We all want to be treated fairly, and we all basically want the same things for our family as anybody else wants for their family.
00:05:07.000And if we learn to apply those five core values when we find ourselves in an adversarial situation or a culture or society in which we're unfamiliar, I can guarantee that the navigation will be a lot smoother.
00:05:21.000And essentially, that's what's happening here at Minds.
00:05:41.000And essentially, it could become a breeding ground for a cesspool of nefarious activities, whether it's extremism or violence or conspiracy theories or what have you.
00:05:50.000So it seems like there's an issue with many social media companies where they want to censor bad ideas.
00:05:58.000It seems to me that part of that is because the work involved in taking a person who's a neo-Nazi or Ku Klux Klan member and showing them the error of their ways, allowing them to spread their nonsense, and then slowly but surely introducing them to better ideas,
00:06:22.000So what Twitter does is like, fuck you, get out of here.
00:06:24.000What Instagram does, the same thing with all these people.
00:06:26.000But the problem with that is then it goes further and further and further and further down where you're getting rid of people for just not agreeing with you.
00:06:36.000So Daryl and I just wrote this paper called The Censorship Effect, along with Jesse Morton, Justin Lane, Leron Schultz, and my brother Jack, multiple PhDs. Like, serious research has gone into this.
00:06:52.000Even outlets on the left like Vox are now admitting that deplatforming has harmful effects. This is being admitted across the board.
00:07:22.000I mean, because these are very smart people that work at big tech sites.
00:07:29.000I don't think they're intentionally causing it.
00:07:31.000I think, first of all, there's an ideology that is attached to all the big tech companies, whether it's Google or Facebook or Twitter.
00:07:40.000You have to be what they think is woke, right?
00:07:43.000You have to subscribe to a certain line of thinking And anybody that deviates from that line of thinking should be suppressed, minimized, or banned.
00:08:54.000He like sips it, it touches his lips, and then he's done.
00:08:57.000He's like, everything is like measured, measured.
00:09:00.000Like, I can't imagine trying to speak freely when you're the CEO of Facebook.
00:09:05.000I think it's almost like pointless to talk to him in that sort of circumstance.
00:09:09.000Well, you know, to your point about, you know, people doing this and defending it and so forth and so on.
00:09:15.000I mean, I think the quote by Upton Sinclair comes into play.
00:09:18.000I think he said something to the effect of, it's difficult for a man to understand something when his salary depends upon him not understanding it.
00:09:28.000And if you live in that world, if you live in that tech world, and I have many friends who have, you know, they're executives at these places.
00:09:38.000You have so many employees that have these radical ideas about what you're supposed to do and not supposed to do, and what you're supposed to platform and not platform, and this idea of platforming people.
00:09:51.000I have people on this podcast all the time that I don't agree with at all, or I agree with them very little, and I want to see what's going on in their head, and I'll get that.
00:10:07.000And they have a right-wing ideology that I don't think should be suppressed.
00:10:13.000I think you should try to pick it apart.
00:10:15.000You cannot change someone's mind if you do not platform them.
00:10:18.000It is impossible for someone with horrible ideology to change.
00:10:24.000I should say, not just a right-wing idea.
00:10:27.000There's a lot of people with left-wing ideologies that I think are ridiculous, and I want to pick those apart, too.
00:10:32.000I want to have conversations with people, and this idea that you're only supposed to have conversations with people that you absolutely agree with, and that what you're doing is just broadcasting these ideas to better humanity.
00:10:43.000If you want a better humanity, have fucking conversations with people.
00:10:46.000Look, you know, this goes all the way back, I mean, centuries, even back to BC, as in before Christ, right?
00:10:53.000I mean, we can go back as far as, let's just say, Copernicus, the astronomer who passed away in 1543, okay?
00:11:03.000Up until then, the belief was that we are a geocentric model universe.
00:11:15.000So even the Catholic Church endorsed that we are a geocentric model universe, meaning that the earth is the center of the universe and everything revolves around us, right?
00:12:02.000So, you know, sometimes we have to stand up to the masses, not just join in because everybody else thinks this way.
00:12:09.000And it's also the problem of the walled garden, right?
00:12:12.000There's a lot of people that get booted from these social media platforms, whether it's Twitter or Facebook, and then they look at that and they look at those people with further and further disdain and it separates them.
00:12:26.000And we're not even just talking about radical people.
00:12:29.000One of the things that really alerted me to how crazy the censorship shit was: Brett Weinstein had a group that he put together called Unity 2020. And the idea was to bring people that were from the left...
00:12:42.000That were really reasonable, and from the right that were really reasonable, that weren't captured by corporate greed, and to have them as an alternative candidate.
00:12:50.000Like, instead of saying, like, you have to be a Republican, or you have to be a Democrat, let's, like, get reasonable left-wing and right-wing people that can agree on a lot of stuff and have them work together, and maybe you have a candidate that's, like, a vice president and a president, one's right-wing, yeah, like, it would be a great way to sort of,
00:13:10.000Twitter banned the Unity 2020 account.
00:13:13.000There was nothing unreasonable about what they were saying.
00:13:17.000It was all just conversations with people that are brilliant that happen to be left-wing and brilliant that happen to be right-wing.
00:13:24.000Let's get them together and see if we could lead this country in a better direction than having this polarization of right versus left where people get super tribal about it.
00:13:32.000Like, this would be a great way to meet in the middle.
00:14:37.000I think there was a real concern in the early days of Twitter and of social media where a lot of these people that were outrageous right-wing people were starting to get a lot of attention, like Milo Yiannopoulos was a big one,
00:14:54.000Gavin McGinnis, and a lot of these guys, they were getting a lot of attention, and the response from the left was like, no, no, no, no, silence them!
00:15:03.000Like, I heard this one woman talking about how her kid listens to Ben Shapiro, and she would love to get Ben Shapiro removed from all platforms.
00:16:01.000And it's clearly shown that certainty accelerates with deplatforming based on whatever you were thinking before.
00:16:10.000So isolation and certainty have an overlap.
00:16:14.000Yeah, so if you have an idea, like especially with something as innocuous as Unity 2020 or beneficial, the idea of unity, I mean, come on, it's like literally in the title.
00:16:33.000You can't be a part of the problem because you're going to draw votes away from the people that we think it's imperative that they win.
00:16:40.000So it changes the whole idea of what democracy is because they're kind of admitting that they have an influence on the way our elections go.
00:16:49.000You know, I mean, and speaking of unity, you got those people who are out protesting every day, you know, to help change and bring people together, but a lot of them are the very ones who will not sit down and talk with the person that they're protesting against, you know?
00:17:05.000So how badly do they really want unity?
00:17:08.000Well, this happened to us, personally.
00:17:13.000I started seeing it, I mean, I guess it was a couple of decades ago.
00:17:17.000You started to see when someone was a controversial speaker, they would come to a university, and instead of someone debating that person or someone, you know, listening to that person's ideas and picking them apart, instead they were, like, pulling fire alarms and shouting people down and screaming at the top of their lungs in the middle of the auditorium.
00:17:36.000They're silencing people's ideas because they feel that their ideas are better, which is exactly the opposite of what the Founding Fathers were trying to sort of manage when they came up with the First Amendment.
00:17:51.000We're really trying to make this less of an emotional debate because I think the censorship and speech stuff is obviously very emotional.
00:17:59.000We're talking about a lot of horrible stuff that hurts people personally.
00:18:02.000And so the big tech strategy is, oh, we care about people's feelings and we want to...
00:18:08.000Hide this information because it's offensive, but we need to remove the emotion and look at this empirically in terms of what is actually making society more healthy and what is actually preventing radicalization and violent extremism.
00:18:23.000There's a difference between radicalization and violent extremism.
00:18:26.000So, if we can prove to big tech that deplatform—we want them to adopt a free speech policy.
00:19:01.000Let me ask you this, like for Minds, like say if someone starts like a neo-Nazi group and they start posting on Minds and they start talking about the master race and eliminating Jews and crazy Nazi type shit, what do you do?
00:19:17.000Oh, I mean, as long as it's not calling for violence or having true threats of violence, then it will go under an NSFW filter.
00:19:26.000So it will go under sort of a, you know, it'll have a sensitive kind of click through.
00:19:34.000So it's like one of those Instagram videos, like if you see a car accident or something like that, there's Instagram videos you have to say that you're willing to see this offensive thing.
00:20:02.000So we're rolling out this system where the community can sort of help create consensus around tags on different content.
00:20:08.000And, you know, if we make a mistake, it can get appealed.
00:20:11.000And the community actually votes, not us.
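The appeals process described here, where the community rather than the company decides whether a tag sticks, can be pictured as a simple jury vote. The function name, quorum, and threshold below are illustrative assumptions, not Minds' actual rules:

```python
def jury_verdict(votes, quorum=5, threshold=0.6):
    """Decide whether a contested tag (e.g. 'nsfw') is upheld.

    votes: list of booleans from randomly selected community jurors,
           True meaning 'the tag is correct'."""
    if len(votes) < quorum:
        return "pending"            # not enough jurors have voted yet
    share = sum(votes) / len(votes)
    return "upheld" if share >= threshold else "overturned"

# A user appeals an 'nsfw' tag; seven jurors vote, five agree the tag fits
# (5/7 is about 71%, above the 60% threshold), so the tag stands:
assert jury_verdict([True, True, False, True, True, False, True]) == "upheld"
```

The point of a quorum plus a supermajority threshold is that no single moderator, and no single juror, can decide the outcome alone.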
00:20:14.000And Darrell, your take on this is like, how do you think that a social media company like Twitter, something that's really huge, can pivot from the model that they have now, where they just ban people?
00:20:29.000Because, you know, that points to them assuming that the majority of people out here are stupid and that these companies need to tell you what to believe, which to me is offensive.
00:20:44.000I believe, you know, yes, there's a lot of bad information out there.
00:20:48.000And, you know, the more liberal you make your platform, allowing anybody and everybody to come in, yeah, you're going to have some bad actors, sure.
00:20:54.000But the way you address it is you combat bad information by providing more good information.
00:21:04.000So Clarence Thomas, Supreme Court Justice, came out and he said that he thinks Networks above a certain size should be considered common carriers.
00:21:15.000There's this whole debate about Section 230 and whether networks have a right to take things down.
00:21:20.000It's pretty definitive that big social networks, private companies do have the right to moderate.
00:21:45.000Yeah, I know Jack Dorsey had an idea of two versions of Twitter, a curated, moderated version of Twitter and then a Wild West version.
00:21:53.000And he was trying to pitch that and I think they shot him down.
00:21:57.000But his idea was like we should have some Twitter that's like got some sort of decentralized control where it's not up to the moderators to decide what's on it and people can just put on whatever they want.
00:22:11.000Yeah, he launched a project called Blue Sky, which is sort of a research initiative into decentralized social media, kind of very much in our space.
00:22:21.000And then he left, like, two days after he left, there was this huge censorship issue where they said, oh, you can, if it's a private image, it can get taken down on Twitter.
00:22:30.000So, like, any private image of anybody.
00:22:32.000Oh, after they left, after he left, they ramped up censorship in a big way.
00:22:36.000And it seems like, I mean, it's a hard position to be in because, you know, it's like your baby.
00:22:41.000This is a company he's been working on forever and he doesn't want to badmouth it.
00:22:44.000But I would not be at all surprised if there were some internal wars happening about, I mean, there's a huge wired piece about internal free speech wars in Twitter management.
00:22:54.000So it's a fact that it's not, you know, one single ideology in these companies.
00:23:00.000There's definitely an overwhelming ideology, but I think that there is starting to be pushback.
00:23:06.000Yeah, there's some intelligent people that realize the error of their ways and that this whole thing is going in a negative direction.
00:23:11.000And Daryl, how did you get involved with Bill and Minds and, like, what was your idea going into this?
00:23:18.000Well, Bill had contacted me after seeing me on some interview or reading about me or something to participate in an event he was originally going to have in New Jersey, then it got moved to Philadelphia.
00:26:20.000He was a guy who was deeply embedded in this sort of Islamic group.
00:26:25.000And then went to jail and realized why he was in jail, started reading, sort of examining this thought process, and came out of it this sort of brilliant mind to analyze what's the process where people get radicalized?
00:26:58.000And that's what, you know, and that's what Minds has in terms of doing the research.
00:27:01.000You know, we've done like a polymath, a 360, digging from all different genres, psychologists, former extremists, trolls, all kinds of people, people like myself with boots on the ground dealing with current extremists and things like that.
00:27:17.000So all of that comes, you know, into the conclusion of this paper.
00:27:21.000Unlike a lot of other papers, they talk about why people do this.
00:27:26.000Others talk about the effect of what they've done.
00:27:29.000Some talk about the cause and the effect.
00:27:31.000But we have the cause, the effect, and the solution.
00:27:35.000It's just hard to get people to jump onto a new social media network.
00:28:25.000But also, Facebook and Google use the dirtiest tricks in the book to grow.
00:28:29.000I mean, they literally latched their tentacles into everybody's phones, grabbed all their contacts, like, you know, followed you in your browser.
00:28:37.000Like, every surveillance tactic they could get to grow.
00:28:42.000So there are these sort of dark growth hacking tricks that a lot of apps will use to increase their user base.
00:28:51.000And it's basically like manipulative growth techniques to get people to give them more information than they otherwise would.
00:28:59.000So let's just say I'm a person who's never used Facebook before and I just got a new phone and I said, you know what, I'm going to download Facebook.
00:30:14.000So like if I have you on my phone and I sign up for Facebook, does it get Bill Ottman plus your phone number and then they could target you?
00:32:04.000Yeah, so it's definitely listening for certain cues and we don't know the breadth of that.
00:32:11.000So it could be that it picks up certain words that would indicate products that maybe you'd be interested in buying and they show you those ads.
00:32:17.000I was looking in my Google data history like a couple weeks ago And I had, you know, I have a bunch of different phones, actually, which I want to show you later, some, like, new open source stuff.
00:32:32.000So, in my Google data history, it showed when I said certain words, like, that triggered it.
00:32:39.000Like, there was all these different words that are sort of commands for assisting.
00:32:45.000Google Assistant, I think it's called.
00:32:46.000And I had turned Google Assistant off, and yet it still had, it was like on June, or on June 18th, you said, you know, hello, or whatever it was.
00:32:55.000And it was just this whole history, and I just deleted it all and, like, turned it all off.
00:32:58.000And it's, um, they're definitely listening for cues, so even if you say no, even if you opt out.
00:36:43.000Use our unique hardware kill switches to physically disconnect Wi-Fi, Bluetooth, cellular signal, microphone, and camera with kill switches.
00:36:52.000Yeah, because a lot of it has to do with the chain of custody of these products because proprietary surveillance chips will get added to the phone in its life cycle throughout the factories globally.
00:37:04.000So they're saying, look, we need to make sure to be able to commit to our customers that there's no sketchy chips on this thing that's feeding data to some...
00:37:14.000And that was the big thing with Huawei, right?
00:37:17.000One of the things about banning Huawei in the United States was they had proven that some of their routers were allowing third parties to access the information as it's distributed between the two parties.
00:37:30.000So a third party could come in, scoop up all the intellectual property and just use it.
00:37:37.000And you know, sometimes some of these companies, they work both sides.
00:37:41.000So the ones that create the device to prevent something is the same company that creates the device to take something.
00:37:48.000Like in the Washington, D.C. area, for example, a few years back, D.C. was being sued for the cameras, the red light cameras.
00:37:58.000You run the red light, you get a ticket.
00:38:00.000Well, Lockheed Martin had created those cameras, and they were shortening the length of the yellow light So you got a bigger chance of running the red light, all right?
00:38:11.000So for every ticket that was written, Lockheed Martin was getting a dollar, and the rest of it would go to the D.C. Police Department, right?
00:38:18.000But Hewlett-Packard, you know, the same ones who make the radar gun that you get caught on are the same ones who make the radar detector that we use.
00:38:47.000I'm using- I got this because the cool thing about the Librem is that you can plug this into a monitor with a keyboard, and this is a computer.
00:39:31.000Yes, it's actually connected to the table.
00:39:35.000Adam has this No Agenda podcast, and they have a No Agenda phone, and it's essentially a de-Googled Android phone that removes all of the tracking stuff, but you can't use navigation on it.
00:39:48.000There's a lot of shit you can't use on it.
00:41:34.000Edward Snowden speaks on it at the bottom.
00:41:36.000It says that software is equally important.
00:41:39.000The iOS and Android operating systems that run on nearly every smartphone conceal uncountable numbers of programming flaws, known as security vulnerabilities.
00:41:50.000That means common apps like iMessage or web browsers become dangerous.
00:45:05.000So that alone is a little bit of a red flag, right?
00:45:09.000Yeah, I mean, Apple is, they try to have this privacy argument, and it's so shocking to me that they try to push that, like, oh, you know, we're not going to let the FBI in, like, trust us.
00:45:19.000And look, Apple makes beautiful products.
00:45:21.000Everybody knows that they're the best designers in the world.
00:48:02.000Nearly as much as I used to, probably like 10% of what I used to. I deleted most of my accounts. I'll check in sometimes, because I like to see what's going on and to understand the market. But, you know, I need to get around to it with maps, and as soon as we have an alternative, I will do it. I'll be the first one in line. I'm trying to get all these options in front of me,
00:48:24.000but it seems like Operating systems and applications, the trend is for them to get more intrusive, right?
00:48:33.000Like TikTok is supposedly, they back-engineered it and said it's the absolute worst software that they've ever examined in terms of, like, violating your privacy.
00:50:37.000But doesn't Apple at least give you the option to block advertisers from accessing your information, block cross-platform or cross-application sharing of data?
00:50:53.000They've been locking down their app store, which has taken billions of dollars away from Google and Facebook advertising because they don't allow apps to do what they used to be able to do.
00:51:43.000Bomber machine that has crazy power and gigantic hard drives and multiple hard drives, way more potent than anything that Apple was selling in the 90s.
00:51:57.000Yeah, for gaming and just for people who do video editing and just people that wanted some crazy, ultra-hyped-up machine, and it would still run the iOS.
00:54:49.000We did a show once at Stubbs, and my friend C.K. brought a bunch of burgers from a bunch of different places, and some of them were plant-based, so I took a bite.
00:58:54.000And so our vision, imagine if rather than tens of thousands of censors who are just going like, down, down, down, take it down, hate speech, misinformation, conspiracy theory.
00:59:07.000What if you had tens of thousands of mental health professionals and positive intervention people and just like people engaging in dialogue who can...
00:59:17.000Provide mental health resources to users who need it to share information.
00:59:21.000I'm not saying you need no moderation.
00:59:23.000You definitely do need a certain level.
00:59:25.000But that's so much money and human energy.
00:59:29.000I mean, you've seen the PTSD studies of these content moderators at Facebook.
01:00:16.000Well, Chinese people have been using that for 2,000 years.
01:00:18.000Would they still be using it 2,000 years later if it wasn't working?
01:00:21.000So now we're accepting, you know, some Eastern culture.
01:00:25.000Now we're, you know, when our doctor does not give us what we...
01:00:28.000Hope will cure us for our cancer, our diabetes, or whatever.
01:00:32.000We go the holistic route, and we've found some pretty amazing results.
01:00:37.000And that's what, you know, Minds is doing, the holistic approach, by giving everybody a platform to share their information, like you just shared about the cabbage juice.
01:00:49.000You know, somebody hears this podcast and goes out and tries cabbage juice, and it clears up their wife's ailment or something like that.
01:00:56.000And this is a good subject to talk about now because we just got through the pandemic and that was one of the things that was suppressed was information about methods of treating COVID. I mean, it was a giant issue where if you talked about whether it's hydroxychloroquine or ivermectin or whatever you would talk about,
01:01:15.000even vitamins, we're talking about like the difference between the COVID results of people that were vitamin D insufficient versus people that had sufficient levels.
01:01:24.000But if you talked about that, you would get in trouble for disinformation or misinformation, and you would get either shadow banned or outrightly banned.
01:01:32.000I mean, there were people that were banned from social networks for suggesting that people who are vaccinated can still spread COVID. That turns out to be an absolute fact now.
01:01:42.000But if you said that eight months ago, nine months ago, instead of Having this conversation and having medical experts debate it and people that understand it and don't understand it, so ask questions and people who are following the standard narrative,
01:01:58.000they express themselves, and then people that have alternative ideas express themselves, and we find out what's right and what's wrong.
01:02:04.000Somebody expressed that it could be treated with bleach, right?
01:02:49.000It's like, you know, somebody being sort of pregnant.
01:02:53.000I mean, either you're pregnant or you're not, you know?
01:02:54.000But if there's multiple statements about an issue, and some of them are correct and some of them are not, then it would be, like, mostly true.
01:03:00.000Well, take a piece of COVID, you know, content like you were talking about, and, you know, there's going to be studies on one side and another.
01:03:06.000What do you do on Minds for that stuff?
01:03:08.000Well, we're building out a sort of citation tool to kind of show the citations on both sides of various arguments and, you know, have it more crowdsourced. This really gets into the realm of decentralized identity and where we're moving in terms of reputation and credibility on the internet.
01:03:27.000And right now, you've got all these different logins, what we were talking about, where things are going with crypto and with the web standards.
01:03:36.000Really, we're moving towards a place where you have these credentials associated with your core identity, which can be generated from like a crypto wallet or something like that.
01:03:46.000And you'll have all these badges that you're earning everywhere you go.
01:03:49.000And you can decide to disclose those or not disclose those, like NFTs.
01:04:10.000So it's like, if I say, oh, Bill is a very good guy, he says a lot of true things, he's very reasonable, so you get a badge for that?
01:04:19.000There could be any infinite number of, you know, badges that you could potentially earn, but like you could be trusted by, say someone in martial arts trusts you and they give you a signal of trust,
01:04:34.000then that would add to your credibility in martial arts in your decentralized identity on the internet, which would be interoperable between social networks.
01:04:44.000So that there's sort of this web of- Oh look, I got a page.
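The badge idea Bill describes is essentially a signed attestation: someone with standing in a domain signs a statement about you, and anyone can verify it later. A toy version in Python, using HMAC as a stand-in for the public-key signatures (e.g. Ed25519 from a crypto wallet) a real decentralized-identity system would use; all names here are invented for illustration:

```python
import hashlib
import hmac
import json

def sign_badge(issuer_secret: bytes, attestation: dict) -> str:
    """Issuer signs a trust attestation. A real system would use
    public-key signatures; HMAC is a simplified stand-in."""
    msg = json.dumps(attestation, sort_keys=True).encode()
    return hmac.new(issuer_secret, msg, hashlib.sha256).hexdigest()

def verify_badge(issuer_secret: bytes, attestation: dict, sig: str) -> bool:
    return hmac.compare_digest(sign_badge(issuer_secret, attestation), sig)

# A martial-arts coach attests to someone's credibility in that domain:
badge = {"subject": "joe", "domain": "martial-arts", "level": "trusted"}
secret = b"coach-signing-key"
sig = sign_badge(secret, badge)
assert verify_badge(secret, badge, sig)

# Tampering with the badge breaks verification:
assert not verify_badge(secret, {**badge, "level": "expert"}, sig)
```

Because the signature binds the issuer to the exact claim, badges like this could travel with your identity between networks, and you could choose which ones to disclose, as with NFTs.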
01:07:08.000And so, when Vox, who are very strongly left-leaning, when they have a piece that they write saying that there are harmful effects of censorship that actually pushes people towards more radical ideas.
01:09:10.000So if you're on Google or Facebook or Twitter, like, yeah, you can silence certain words or topics.
01:09:16.000But when you're thinking of the internet as a whole, then, you know, the total reach is not necessarily going down.
01:09:25.000And we need to start thinking about the internet as a whole, not just isolated networks.
01:09:30.000Like, you can't claim that censorship of COVID misinfo worked when you just banned it from Google and it just went up. Like, what about the global numbers?
01:10:25.000You can actually get more reach on Minds than Facebook or Twitter if you're a small creator.
01:10:32.000Because small creators, like getting out of the void on social is so hard.
01:10:36.000And we have this reward mechanism where you can earn tokens and boost your content.
01:10:41.000And, you know, we also just rolled out this build-your-algorithm feature where you can actually opt in to see people who are different from you or similar to you.
01:10:51.000Or you can opt in to increase the level of tolerance that you have to ideas that you disagree with.
01:11:13.000Open up your recommendations to not just stuff that's going to bring you down your own echo chamber, but expand it.
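That build-your-algorithm dial can be pictured as a ranking function with a user-controlled tolerance parameter. This sketch is hypothetical, not Minds' actual feature; `post_affinity` and `tolerance` are invented names:

```python
def score(post_affinity: float, tolerance: float) -> float:
    """Rank a post for a user's feed.

    post_affinity: -1.0 (opposes the user's views) .. +1.0 (matches them).
    tolerance:      0.0 (pure echo chamber) .. 1.0 (boost disagreement)."""
    agree_score = post_affinity        # echo-chamber component
    challenge_score = -post_affinity   # opposing-view component
    return (1 - tolerance) * agree_score + tolerance * challenge_score

# With the dial turned up, an opposing post outranks an agreeable one:
assert score(-0.8, tolerance=0.9) > score(0.8, tolerance=0.9)

# With the dial at zero, the feed is a pure echo chamber:
assert score(0.8, tolerance=0.0) > score(-0.8, tolerance=0.0)
```

The design point is that the blend between "more of what I agree with" and "more of what challenges me" is a parameter the user sets, rather than a fixed engagement-maximizing objective set by the platform.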
01:11:20.000Now, Darrell, I want to talk to you about your personal experience on Minds with what you do, what you're known for.
01:11:27.000Have you had interactions with people on Minds that have been favorable, that you've kind of pushed people into a— Yeah, I've had a few, and I've had my share of detractors.
01:11:37.000Some people think what I'm doing is totally wrong and don't get me or whatever.
01:11:41.000But yeah, I've had interactions with some people.
01:11:44.000When you say people have said it's totally wrong, what kind of criticisms do they have for that?
01:11:50.000It depends upon where they're coming from.
01:11:52.000Some people think it's not my job to teach white people how to treat us.
01:12:10.000But a lot of people, they don't see that, because they would not take the time to sit down and have somebody tell them some nonsense that Jews are the children of the devil or some crazy things like that.
01:12:22.000I will sit and listen to that and I will put up with it.
01:12:25.000Because in order for me to speak my mind, I have to listen to somebody else's.
01:12:33.000So they're not willing to put in that time.
01:15:27.000She travels to every show around the country with him.
01:15:29.000And when she gets in the box, there's a pair of mannequin legs laying on the floor of the box that are wearing the same stockings and same shoes that she has on.
01:15:38.000She picks them up, shoves them out the hole.
01:15:41.000When he says, move your feet, she shakes those things.
01:15:43.000And then she brings her own legs up under her chest.
01:15:46.000So her whole body is on that half of the box.
01:16:12.000You know, I guess that would be the only way that would work.
01:16:14.000You've offered him a better perception, and that perception then becomes his reality.
01:16:20.000So don't attack somebody's reality, regardless of what it is, even if you know it to be false.
01:16:25.000Give them a better perception and allow them to resonate with it, because it's always better when somebody comes to the conclusion, I've been wrong.
01:16:32.000Maybe this is something I need to think about.
01:16:45.000And, you know, so Daryl always talks about how much he listens when he starts a dialogue and doesn't even try to, you know, push ideas at the people that he's engaging with, different extremists or whatnot.
01:17:28.000The data and the statistics show that there are more blacks in prison than white people.
01:17:33.000So that feeds what he already thinks he knows, the data, right?
01:17:38.000But he does not go to find out why does that data show that.
01:17:43.000He doesn't realize there may be an imbalance in our judicial system that sends black people to prison for longer periods of time than white people who've committed the same crime.
01:18:21.000And then he says, and black people are born with smaller brains.
01:18:26.000And the larger the brain, the more capacity for intelligence.
01:18:29.000The smaller the brain, the lower the IQ. So now I'm being called stupid.
01:18:33.000Now, he says that this is evidenced by the fact that every year the data shows that black high school students consistently score lower on their SATs than white kids do.
01:21:36.000I've got to make up my mind, what am I going to do?
01:21:38.000So the dilemma is, do I disregard whatever color he is and believe the truth because I know it to be true and change my ideological direction?
01:21:48.000Or do I consider the color of his skin and continue living a lie?
01:21:52.000In most cases, people will follow the truth.
01:21:54.000But then there will be those who don't want to give up the power or the notoriety or whatever, and they will follow the lie.
01:22:01.000Well, the way you're doing it is brilliant because you're doing it so patiently and contrary to the way most people handle arguments.
01:22:08.000Most people handle arguments by trying to shut down the other person's argument and shit all over them instead of trying to, what you're saying, offer an alternative perspective, which is really probably the only way to get people to think about things in a different light.
01:22:21.000And, Joe, that comes from the fact that I've done a lot of travel.
01:22:25.000I've been exposed to people from all over the world.
01:22:29.000You told a story on the podcast the first time you were here about not even understanding racism as a child because you grew up overseas.
01:23:22.000And Mark Twain said, quote-unquote, travel is fatal to prejudice, bigotry, and narrow-mindedness, and many of our people need it sorely on these accounts.
01:23:31.000Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one's lifetime.
01:23:43.000And so Sam Harris actually did a study that we talk about in the paper.
01:23:47.000He did a neuroimaging study of people being exposed to political beliefs different from their own and actually looked at people's brains when they were going through this experience.
01:23:59.000And they actually talked about this thing called the backfire effect, which is sort of what you're talking about when the wall's up.
01:24:07.000And so they sort of detected that, interestingly.
01:24:10.000And I forget the exact name of the study, but it's in the footnotes.
01:24:58.000You know, Daryl, I'm just thinking while I'm listening here, these conversations that you've had with these white supremacists and neo-Nazis, how amazing would it be if that was a podcast?
01:25:11.000No, but I'm saying, if you sat down with those people from the beginning, from first meeting them, and see that conversation play out, that would be very relatable.
01:25:39.000Some of them I think are, but if not, I can send you some.
01:25:41.000I think those videos would be a great tool for someone that's maybe trapped, but at least partially open-minded, where they have this view of things, like, maybe I'm incorrect about this.
01:25:55.000But as a podcast, that would be brilliant.
01:25:59.000That's a great idea to have someone from the jump walk in a KKK member and have this conversation where they sit down with you over hours and hours and present all these articles about crime, brain size, all this shit, and have you just tell them your perspective and see the wheels start turning.
01:26:21.000Because I think sometimes a lot of these people are only interacting with people that think like them.
01:28:30.000I said, Charles Manson, Jeffrey Dahmer, Henry Lee Lucas, John Wayne Gacy, Ted Bundy, Albert DeSalvo, the Boston Strangler, David Berkowitz, Son of Sam, on and on.
01:29:22.000And he said it, and I didn't know the guy before I had him on, and while I was having him on, I was realizing a lot of the shit that this guy's saying...
01:29:35.000Yeah, but back in those days, I would have people on where I would just read something and say, well, this is probably a conversation that's controversial, I'll talk to this guy.
01:29:44.000But some of the things he was saying, that was one of them, was that black people had this gene for violence.
01:29:48.000And I go, well, how the fuck do you explain war?
01:29:51.000My take was, like, most wars are started by white people.
01:29:54.000If you looked at the amount of war that goes on worldwide, how much of it is instigated and initiated by white people? And is there a thing more violent than war?
01:30:07.000It's like literally you're telling people that don't even know each other that it's their obligation to kill someone based on what land they're from or what part of the world.
01:30:17.000That's the most violent shit we know, and it's all by white people.
01:31:00.000All sorts of parts of the world where there are these military actions that we're ignoring.
01:31:05.000There's actually a chart that someone put up.
01:31:07.000It's like a graphic that shows the bombings and the people that died in Ukraine versus the people that are dying right now simultaneously due to US drone strikes and all sorts of other shit that's happening all over the world at the same time.
01:31:24.000We're concentrating on this one thing, and it's in the news, and that's part of the reason why people are concentrating on it so much.
01:31:30.000Well, I learned a long time ago when I was living overseas, if you want to learn about your own country, read a foreign newspaper.
01:31:38.000Like the Herald Tribune, the French paper, their perspective on what's going on in the U.S. Because we don't tell our own people everything,
01:31:46.000just the same way Russians don't tell their own people everything.
01:31:49.000I'm interested, you know, that you had that feeling that, you know, maybe you shouldn't have had that person on.
01:31:56.000I know, I know, but I'm just, I'm saying that I think that, because I'm sure that was a, I don't know who you're talking about, but I'm sure that was a productive conversation in certain ways, and I feel like there's this chilling effect that is happening.
01:32:09.000Where we're afraid to have a conversation with a murderer.
01:32:16.000Or maybe not a murderer, but that's kind of the funny thing.
01:32:19.000You could interview probably a serial killer on this show, and that would be fascinating.
01:32:25.000Oh, dude, Joe's, like, gonna become a serial killer.
01:32:28.000He just had a serial killer on his show.
01:32:30.000And, like, people are obsessed with true crime and, you know, obsessed with interviews with some of the worst humans that have ever existed.
01:32:39.000And those are considered to be extremely valuable interviews.
01:32:49.000Own your ability to do that in a way where people aren't assuming that you think or you endorse the views of people that you're talking to.
01:33:18.000What other human being has a documented result of literally hundreds of KKK and neo-Nazi people abandoning their ideology because they've had a conversation with you and literally had a change of heart, an actual change of heart?
01:34:57.000And the ones that do, what would be smarter than for Google, Facebook, Twitter, whoever, to actually start doing some of this stuff and start to be more transparent?
01:35:08.000I think the amount of moderation that they would require would be extraordinary.
01:35:13.000You can achieve it with community-centric moderation.
01:35:31.000Well, I'm saying that Twitter rolled out a product called Birdwatch.
01:35:35.000Which was a – and I don't know if it's still going on, but this was like last year.
01:35:40.000It was a community-centric moderation tool to get the – so let's separate payments from actually getting the community involved in the moderation.
01:35:48.000So communities are already heavily involved in moderation.
01:36:56.000In the confines of the campus, but the objective of higher education is to teach people how to navigate society beyond the campus and be a productive citizen, right?
01:37:08.000So you got to let people learn that, hey, you're a woman.
01:37:13.000But when you graduate and you go out there and work in the real world, you might be sexually harassed by your boss.
01:37:19.000You might not get paid as much as your male counterpart who knows less than you or whatever.
01:37:23.000Or you might not get the job because you're black or because you're whatever.
01:37:27.000This is, you know, in addition to the academic education, they need this empirical education.
01:37:33.000And those institutions that are shutting me down are not providing it.
01:37:37.000But what I was going to say also was today you got – and speaking of cancel culture, you got people banning books and banning history classes under the guise of CRT, critical race theory, things like that.
01:37:49.000You've seen the pictures of a black girl walking towards a white school building for the first time, people behind her yelling at her and all that kind of stuff, or the four black guys sitting at the Woolworths counter in Greensboro and people pouring stuff over their heads.
01:38:07.000Those white people that did this made history back then, and now it's those same people that are saying, we don't want that taught in the schools.
01:39:00.000What's neutral about police dogs attacking peaceful black marchers on the way to the courthouse to register to vote?
01:39:06.000There's nothing neutral about it, but I think that there's definitely some ideology that is attached to critical race theory that is rooted in critical theory, which is a left-leaning...
01:39:21.000There are multiple definitions of critical race theory, and nobody has really explained it.
01:39:30.000You're painting white people as the oppressors and black people as the oppressed, and that's how you are, and you will never change.
01:39:40.000That's how the people who are opposed to it define it.
01:39:44.000But, you know, that's not necessarily how some of the people who participated in the creation of it, like Kimberlé Crenshaw, and I can't speak for all of them, define it, you know?
01:39:53.000So it needs to be, all history needs to be taught, you know, and through the lens of what happened.
01:40:45.000Education exposure is the key to advancement.
01:40:48.000Well, what we need to do is take your approach, the way you've had these conversations with these KKK people and these neo-Nazi people.
01:40:55.000That has to be across the board with everything.
01:40:58.000Let a person explain their position and then you come up with either a better argument or you agree with part of what they're saying or...
01:41:08.000The only way is to not silence them, to let them talk.
01:41:12.000So if people are against critical race theory for any particular reason, they should listen to the entire argument of what critical race theory entails, at least from that person's perspective, and then say, this is what I agree with, this is what I don't agree with,
01:41:27.000and have a conversation that's rational.
01:41:34.000They're not attacking the person with insults.
01:41:36.000They're just talking about what is correct and incorrect about everything from economics to health care to everything.
01:41:45.000These kinds of conversations are how people find out what's right and what's wrong and how people find out what resonates with them.
01:41:52.000And as soon as you shut people down, those conversations stop.
01:41:56.000And then these people go off into their own corner of the world where they are accepted, and they get in an echo chamber, and they just reinforce whatever stupid idea they had in the first place.
01:42:05.000Yeah, what you were saying about watching people change their minds, like their interviews, that is so powerful.
01:42:10.000And we're actually launching this Change Minds
01:42:14.000sort of challenge where we're going to be trying, as a campaign on the site, to have people make videos and tell stories of a meaningful time that they changed their mind.
01:42:24.000Because everybody should be doing that more. What's a recent time you've changed your mind about something sort of meaningful?
01:43:30.000Because if a leopard cannot change his spots and a tiger cannot change his stripes, why would I think that a Klansman could change his robe and hood?
01:43:41.000But I changed my mind because those conversations did change that person.
01:43:47.000And you're right, a leopard cannot change its spots and a tiger cannot change its stripes because those two animals were born with those spots and stripes.
01:43:58.000That Klansman or Klanswoman was not born with that robe and hood.
01:44:43.000But then once it isn't, there's some people that for whatever reason never want to admit they're wrong because they think that being wrong makes them less.
01:44:53.000Yeah, to play devil's advocate with ourselves, I mean, I'm not even ideological about our model.
01:45:03.000I actually think that I'm open to seeing, you know, over the course of 10 years, like, let's actually come back in a few years and look at the data that...
01:45:15.000The information that we've gathered about the rate of deradicalization and whatnot.
01:47:03.000So we have the distinction between misinformation and disinformation.
01:47:07.000The difference is that disinformation is intentional manipulation.
01:47:13.000I think that it really depends on the context of the specific post that we're talking about.
01:47:19.000So I don't want to make a generalization. There's troll farms in the US that are doing all kinds of inauthentic content engineering for different political purposes.
01:47:29.000It doesn't matter where the content is from.
01:48:21.000Which is starting to build this interoperable identity that you carry between social networks.
01:48:28.000So basically you're bringing your credibility, your identity, whatever you want to share, whether it's art, you know, content, it's all tied to you and you're sort of moving around freely in a sovereign situation.
01:48:45.000I think that's where we want to go long term so that you're not locked in.
01:48:49.000As technology evolves, so should ideology.
01:49:11.000And the thing is, when people don't want that, and they want people censored, what you're saying is your ideas won't hold up.
01:49:19.000Because you don't want to, if we could all have debates in real time with good ideas versus bad ideas, and everyone gets a chance to watch, it's gonna be messy.
01:49:31.000But at the end, you're gonna at least know where you stand.
01:49:37.000Because you've had both arguments played out in front of you, whether it's left versus right or whatever it is when you're talking about ideologies.
01:49:45.000You've got to watch these people have these conversations.
01:49:50.000And if you can do that, you can kind of agree with one or disagree with the other and find out where you fit in this.
01:49:57.000Or take something good from that person, something good from that person, and put them together.
01:50:19.000But there are, like, distributed systems, like IPFS that I mentioned, and Arweave, and some of these, like...
01:50:26.000Systems where it's decentralized and, you know, you don't have to pay for all of the storage, but the bandwidth is still an issue and, you know, it's a spectrum with the decentralized stuff.
01:50:38.000But, yeah, so, dude, I have this cool thing.
01:51:17.000So yeah, you plug it into your computer and you can send Bitcoin to it, and then you've got to puncture that hole, and that's what unlocks the private key.
01:51:25.000So I can give it to you and you cannot access the Bitcoin on this until that hole is punctured.
01:51:30.000And then you plug it back in and you can actually take control of the Bitcoin.
01:51:33.000What does that have to do with censorship?
01:51:34.000So, well, this is the ultimate censorship-resistant crowdfunding mechanism.
01:51:39.000This is totally uncensorable money; anyone can send crypto to it.
01:51:43.000Right, but we're talking about discussions, conversations.
01:51:46.000Well, yeah, and so the reason I'm bringing it up is because we are putting a full Bitcoin towards, you know...
01:52:03.000The address for this wallet is published at minds.com/change.
01:52:07.000And so what we want to do, you know, you see all the censorship, the financial censorship happening, which is correlated to censorship of speech.
01:52:16.000Google has now suspended monetization on YouTube for all users in Russia.
01:52:41.000It's not only an online social network.
01:52:43.000And so if the address, if anyone is interested in supporting the conversations, the long-form conversations we're having with Daryl, Please, you know, send Bitcoin to this address, and we're going to put it towards that.