In this episode of the podcast, we talk about a hoax played on Bill by someone pretending to be Joey Diaz, and what it means for the future of postmodernism. We also talk about the Grievance Studies hoax, in which Peter Boghossian and his collaborators fooled a set of academic journals with fake papers. We hope you enjoy this episode and that it makes you think about what it's like to be the victim of a hoax! Also, we apologize for the audio quality in this episode; we're working on fixing that, and we promise it'll be better in the future. Enjoy the episode and spread the word to your friends and family about this one! Logo by Courtney DeKorte. Theme by Mavus White. Music by PSOVOD and tyops. Art: Mackenzie Moore. Editor: Patrick Muldowney. Cover art by Ian Dorsch. The theme song is by Suneaters, courtesy of Lotuspool Records. Our ad music is by Build Buildings. This episode was produced by Micah Vaynerchuk. Please rate, review, and subscribe to our podcast on Apple Podcasts, and we'll be looking out for your comments and thoughts in the next episode! Thank you to our sponsors and patrons for supporting the show.
00:00:17.000I'm trying to get back into handwriting.
00:00:18.000For people who don't know, Bill is the CEO and co-founder of Minds.com, and we've been going back and forth through email, and you got hoaxed by some dude who said he was Joey Diaz.
00:01:16.000Which is not good, because I actually majored in English.
00:01:20.000Yeah, you definitely lose your ability to write words.
00:01:22.000It's funny, I tried writing in, for whatever reason, I write mostly in all caps, because I mostly just write notes, but I tried writing with lowercase letters, and then I tried writing in cursive, and my cursive is like, it's almost like I have to relearn it.
00:01:37.000Yeah, I was finding myself just trailing off at the end of certain words, and blending it all together.
00:03:29.000They submitted a bunch of fake studies to these journals and not only got reviewed, but got lauded and praised for their academic scholarship.
00:05:11.000Okay, let's say that you find some Chinese bot that's purposely disseminating incorrect and negative information about maybe a potential presidential candidate.
00:05:57.000It's called the Manila Principles, which the Electronic Frontier Foundation wrote with a bunch of other internet freedom groups, which is talking about how digital intermediaries shouldn't be making these subjective decisions about what's getting taken down and should require a court order.
00:06:12.000Now, with a DNS provider or something less content-focused...
00:06:19.000Explain DNS to people who don't know what you're talking about.
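For readers who want the explainer being asked for here: DNS is the internet's phone book, the system that turns a name like example.com into the numeric IP address computers actually route traffic to. A minimal lookup can be sketched with Python's standard library; the `resolve` helper is invented for illustration and is not anything from the show:

```python
import socket

def resolve(domain: str) -> list[str]:
    """Return the IPv4 addresses the system's DNS resolver finds for a domain."""
    infos = socket.getaddrinfo(domain, None, family=socket.AF_INET)
    # Each entry is (family, type, proto, canonname, (address, port));
    # collect the unique address strings.
    return sorted({info[4][0] for info in infos})

# "localhost" resolves locally, without touching the network.
addrs = resolve("localhost")
```

Running `resolve("example.com")` instead would ask your configured resolver over the network and return whatever public address the domain currently points to.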
00:07:04.000There was a guy, the wrong-skin guy, who was saying he was born in the wrong color skin, that he's transracial, and he would make these ridiculous arguments about it.
00:07:15.000Apparently he's a comic from the show.
00:07:16.000I was just thinking, this sort of started, I don't know the moment they had to do it, but let's say four years ago, a lot of those troll accounts had to sort of say, we're not...
00:08:31.000This strange new ground that we're covering.
00:08:37.000We've been discussing this ad nauseam on the podcast lately, that essentially we've been dealing with 20 years of this.
00:08:43.000And in those 20 years, it's changed radically.
00:08:47.000What it is, it's become something completely different.
00:08:51.000It's become something that changes public opinion on things overnight.
00:08:55.000It's become something where you can distribute information from person to person about some huge international news event.
00:09:07.000You can get all of your information from Twitter, whether it's what happened in Venezuela or anywhere else something's happening in the world.
00:09:14.000People are turning to social media almost before they turn anywhere else.
00:09:19.000When I hear about something, I almost always, before I even Google it, I almost always go to Twitter and check Twitter and see what's going on.
00:11:21.000Their searches are not going to be private.
00:11:23.000Like, say if you search, like, you're thinking about buying a Jeep, and you search Jeeps, you look at, you know, 2019 Jeep, and then all of a sudden all your Google ads are about Jeeps.
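The Jeep example describes search-based ad targeting: recent searches build an interest profile, and ads follow the dominant interest. A toy sketch of that mechanism, with every keyword, category, and function name invented for illustration (a real system runs an ad auction, not a lookup table):

```python
from collections import Counter

# Hypothetical mapping from search queries to interest categories.
CATEGORIES = {"jeep": "autos", "2019 jeep": "autos", "tent": "outdoors"}

def interest_profile(searches: list[str]) -> Counter:
    """Tally which interest categories a user's searches fall into."""
    profile = Counter()
    for query in searches:
        category = CATEGORIES.get(query.lower())
        if category:
            profile[category] += 1
    return profile

def pick_ad(profile: Counter) -> str:
    """Show an ad for the dominant interest; real systems auction the slot."""
    if not profile:
        return "generic ad"
    top, _ = profile.most_common(1)[0]
    return f"{top} ad"

profile = interest_profile(["Jeep", "2019 Jeep", "tent"])
ad = pick_ad(profile)  # autos dominate, so the Jeep searches win the slot
```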
00:11:47.000Yeah, and so there's all different layers of like what we use with your browsers, your apps, your operating system, your food, your, you know, government, your energy,
00:12:07.000It has code that's associated with it.
00:12:11.000And when you open up your computer, when you sign into a browser, when you open up an app, you are empowering that app.
00:12:18.000That's how the apps of the world become huge, monstrous corporations, is because we all use them every day.
00:12:23.000So if you switch from macOS to GNU/Linux, like Debian or Ubuntu, if you use Brave or Firefox, if you use DuckDuckGo... DuckDuckGo is actually proprietary, which is annoying, but they are very privacy-focused.
00:13:26.000And I just think it's important for people to use things that are transparent to them and respecting our freedom.
00:13:34.000Yeah, I think one of the problems with these giant companies is that once they become big, you kind of use them as a default, and it's very difficult to get people to communicate with you off of them.
00:13:43.000It's hard to say, hey man, I'm launching this new social media app.
00:13:48.000I would imagine you could speak to this.
00:13:50.000I'm launching this new social media app, and I want you to join it.
00:13:54.000People are like, but I'm already on fucking Facebook.
00:14:53.000Facebook and all the congressional hearings and the inner workings of it all.
00:15:01.000The fact that it profits off of outrage, so it wants people to argue.
00:15:09.000The AI, the computer learning, specifically wants people to have contentious debates about things because that keeps their eyes focused on the website.
00:15:20.000And if your eyes are focused on Facebook, then those Facebook ads are very valuable.
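The engagement loop described here, where contentious posts outrank a chronological feed because arguments keep eyes on the site, can be sketched as a toy ranking function. Every weight and field name below is invented for illustration; this is not Facebook's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    predicted_comments: float    # proxy for contentious debate
    predicted_reactions: float

def engagement_score(p: Post) -> float:
    # Heated comment threads keep users on-site longest, so they get
    # the heaviest weight; age decays the score toward recency.
    return (3.0 * p.predicted_comments + p.predicted_reactions) / (1.0 + p.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by predicted engagement instead of chronology."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a", age_hours=1, predicted_comments=0.2, predicted_reactions=5),
    Post("b", age_hours=6, predicted_comments=9.0, predicted_reactions=2),
])
# The older but more contentious post "b" outranks the fresh post "a".
```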
00:16:26.000Facebook suspended the In the Now page.
00:16:32.000At the behest of CNN and the U.S. government-funded think tanks, it says we had almost 4 million subscribers, did not violate Facebook rules, were given no warning, and Facebook isn't responding to us.
00:16:45.000So yeah, what actually started this off?
00:18:55.000He seems like he's too rich, like he fucked up.
00:18:58.000Like he's there sipping water like a robot, trying to figure out what the fuck he's doing with his life.
00:19:02.000I think that they're scared because they know they've betrayed everybody, and so it's hard to get them to speak.
00:19:10.000You know, it's interesting with Dorsey here, because I give him credit for speaking, but the fact is that he's not answering the questions.
00:19:21.000Well, he's bringing somebody else in to answer the questions in the next go-round.
00:19:24.000And so that should be very interesting.
00:19:26.000And you think he actually didn't know the answer to those questions?
00:19:27.000I think he probably doesn't know all the specifics because he's a CEO of not one but two different corporations.
00:19:38.000True, but I think that when we look at the policy that exists on these networks, he is in control of the policy to a large degree.
00:19:47.000There's a board, there's a decision-making process, but he has a large voice.
00:19:51.000Okay, I don't know how large his voice is.
00:19:53.000I assume that's probably true, but one of the things we did detail on the last podcast with Tim Pool was how he wasn't the CEO for quite a long time.
00:20:00.000Yeah, he got fired and then rehired at some point.
00:20:02.000Yeah, so obviously there's some contention, there's some issues, and there's a lot of money involved in these things, and I think that plays a giant part in how they decide to make decisions.
00:20:39.000I mean, if the controversial videos are about how Jews are evil, and you have this video about Jews being evil, and then you're like, buy Razer computers!
00:20:48.000Right, but do you think that people actually...
00:20:51.000I can understand not wanting to support certain types of content.
00:20:55.000And maybe advertisers feel like they're supporting that content by advertising next to it.
00:21:01.000But I also don't think that people, when they're watching a controversial video on the internet, say, oh my gosh, you know, this advertiser is completely out of line for being next to this controversial thing.
00:21:14.000I don't think that's a healthy direction to move.
00:22:27.000And it's really damaging for brands when it gets demonetized right away because it's that initial time period that generates the most revenue.
00:22:34.000So when you have to go back and do it...
00:24:24.000It's like, when you're a public forum on that scale...
00:24:28.000The community just has a right to know what the algorithms are doing.
00:24:31.000So you think that they're not sharing their software because their software is encoded and designed to spy on you and extract information and sell that information?
00:26:05.000Well the algorithms, you're only reaching 5% of your own followers organically on Facebook now.
00:26:11.000And they're starting to change the chronological feed on Instagram too.
00:26:15.000And they know that this causes depression, and they're still doing it, because they think they're better at showing you what you want to see than you are.
00:26:29.000What do you mean by they know that this causes depression?
00:26:31.000They've done studies about mental health in relation to...
00:26:36.000Actually, Facebook got exposed like five years ago for doing a secret study on...
00:26:41.000On like a few million users where they were injecting both positive and negative content into the newsfeed and they proved that they could affect people's moods.
00:28:28.000I think that the core purpose of a social network is to subscribe to someone and see their stuff.
00:28:33.000And when people subscribe to you, they see your stuff.
00:28:36.000So when you spend years building up a following on social media, and say you earn 100,000 followers or something, and then suddenly the network says, nah, your friends can't see that anymore.
00:31:25.000How many different companies are subscribing to that?
00:31:29.000It seems like all the big ones we're saying are curating and moving things around and all the big ones have an algorithm that's designed to keep you on board, right?
00:31:47.000But taking away people's reach when they have worked years and years to achieve it, that's not okay.
00:31:54.000Do you think that this is this marriage between something that is this social media network that's designed to allow people to communicate with each other and then commerce, like this business, like how do we maximize this business?
00:32:07.000How do we get more profit out of this business?
00:32:08.000How do we get these people to engage more?
00:32:11.000And then they start monkeying with the code and screwing with what you see and what you don't see.
00:34:00.000But the weird thing is that even though we're a fraction of the size, creators, especially smaller creators, get better reach on Minds than they do on Facebook and Twitter, because we have this reward and incentive system, sort of gamified, where you earn reach and you earn more of a voice for contributing.
00:34:21.000So like you could have an account on Twitter for 10 years and post thousands and thousands of tweets and you never hit that viral nerve and you just never really get much exposure.
00:34:32.000So we're trying to help people be heard.
00:34:35.000And so you'll find a small creator who on other networks has no followers, but has thousands and thousands of followers on Minds.
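The "earn reach by contributing" idea described above can be sketched as a toy model. The point schedule, boost formula, and `Wallet` class are all invented for illustration; this is not Minds' actual token system:

```python
class Wallet:
    """Toy model: contributions earn points, points buy extra reach."""

    def __init__(self):
        self.points = 0

    def reward(self, action: str):
        # Hypothetical reward schedule for on-site contributions.
        schedule = {"post": 1, "comment": 1, "upvote_received": 2}
        self.points += schedule.get(action, 0)

    def boost(self, base_reach: int, spend: int) -> int:
        """Spend points to multiply a post's reach; returns the new reach."""
        spend = min(spend, self.points)  # can't spend points you don't have
        self.points -= spend
        return base_reach * (1 + spend)

w = Wallet()
for action in ["post", "comment", "upvote_received", "upvote_received"]:
    w.reward(action)
# 1 + 1 + 2 + 2 = 6 points earned; spending 5 multiplies reach sixfold.
reach = w.boost(base_reach=100, spend=5)
```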
00:34:42.000And what do you think you would like to do with Minds in the future that you haven't been able to do yet?
00:34:49.000Engineer the control out of ourselves, so that we aren't even in a position to really take people's stuff down. But what if someone posts your house and your information?
00:35:38.000You know, 25 years ago, would you have thought you'd be sharing, you know, 20% of your life live streaming to, you know, millions of people?
00:35:47.000Like, our lives are becoming more transparent just inevitably.
00:36:56.000A lot of conservatives on Twitter are finding that.
00:36:59.000Sam Harris actually just sent me an article.
00:37:02.000It was detailing the bias against conservatives on Twitter; they've actually done, you know, some real study of it, and it's pretty demonstrable.
00:38:41.000So what you're saying is that these algorithms that they use in order to maximize their revenue and give people things that they like but actually takes away from things being posted chronologically, keeps certain things from being seen by as many people, so it keeps them from being as viral,
00:38:58.000so it keeps the whole thing from being organic.
00:39:18.000I mean, it's one of the reasons why I wanted to have you on.
00:39:20.000I wanted to find out where these upstarts, these new people that are coming into the game, like Minds, where you're coming into the game from, and what your position is on what's wrong with the current state of affairs.
00:39:33.000Yeah, and look, there is messed up stuff on social media.
00:39:39.000We'll get pigeonholed into being like, oh, you support all of this crazy stuff.
00:39:44.000First of all, most of the users online are artists, musicians, filmmakers, activists, journalists, just trying to get their content out there.
00:39:51.000There's a very tiny minority of actually crazy content.
00:39:56.000When you say crazy content, what do you mean?
00:41:48.000No, it's really good because he's such a nice guy.
00:41:51.000He's so easy to get along with that they let the guard down around him.
00:41:57.000You get to see these people kind of confused that they like this guy.
00:42:03.000That's why I think initiating human contact via the social networks, that's really important.
00:42:11.000But, to play devil's advocate, it's one of the worst ways for people to express themselves in a way where you consider other human beings' experiences and feelings and the way they're going to receive what you're saying because there's no social cues, you're not interacting with them, you're not looking at them in the eyes.
00:42:27.000It's one of the weirder forms of communication between human beings, and one that I would argue we have not really successfully navigated yet.
00:42:50.000She was with the Westboro Baptist Church.
00:42:55.000You know, the famous one that protests those soldiers' funerals and anything gay.
00:43:02.000They're like ruthlessly, viciously fundamentalist Christians.
00:43:08.000They do a lot of protesting at funerals and do a lot of stuff to try to get...
00:43:11.000She was with them for the longest time and then got on Twitter.
00:43:17.000And through communicating on Twitter, and when you meet her, you would never believe it in a million years that she was ever this fundamentalist and that she was ever some mean person sending hateful messages to people because their son was gay or whatever it was.
00:43:38.000She does a podcast now and gives TED Talks and speaks about radicalization and about how she was kind of indoctrinated and grew up in this family.
00:43:46.000And her grandfather, Fred Phelps, was this, you know, it's like, it's a fucking mean guy.
00:43:52.000Like a really mean, he's the God Hates Fags guy.
00:43:55.000You know, they would have those signs that they would hold up at soldiers' funerals.
00:44:00.000I mean, it's like really inflammatory stuff.
00:44:02.000But through Twitter, through her communicating with people on Twitter, specifically her now husband, like, he cured her, like, just with rational discourse and communication, and she was open to it.
00:44:42.000He was a white supremacist, a KKK member, who's been on Sam Harris's podcast, he's also done some TED Talks, who now speaks out against it and talks about how he was indoctrinated, how lost he was, and how he was brought into this ideology.
00:44:59.000There's many people like that all over the world.
00:46:23.000Like, maybe you are, like, a hyper-radical lefty, and maybe Jamie's points of view and yours are just never going to line up, so you're like, fuck him, he's banned for life, which a lot of people have been banned for life.
00:46:36.000And when you look at some of the infractions they've been banned for, they're like, boy, I don't know about that one.
00:47:58.000I think, though, that your ideology is going to be, your point of view and perspective is going to be very different than maybe someone who's like a radical Marxist.
00:48:09.000You know, shouldn't they be allowed to post on the site too?
00:48:11.000Someone who's like an extreme socialist.
00:50:15.000I got, like, octopuses banging chicks in every hole, and they're choking on it, and they've got one in their ass and one in their vagina, and it's all, like, very liquidy.
00:50:26.000You know, there's a lot of splattering going on.
00:50:28.000You're like, what the fuck is this, and is that okay?
00:50:32.000I mean, if it was a person getting fucked left, right, and center by an octopus, he'd be like, yeah, I think we've crossed some lines here.
00:51:32.000To be honest, Time Magazine just did a really interesting piece about a statue that got banned from Facebook.
00:51:37.000It was a naked ancient statue that has a nipple.
00:51:41.000Like, I'm sorry, that's not realistic.
00:51:45.000That's not helping society, taking down a naked statue.
00:51:50.000Well, we were talking about the other day, during the Super Bowl, that Adam Levine had his shirt off, and Brian Redman was like, hey, wasn't that what Janet Jackson got in trouble for?
00:52:01.000Why is it okay if Adam Levine shows his nipples, and Janet Jackson's nipples are offensive because they're sexualized, because she's a woman?
00:52:52.000And then when things come up, like one of the things that Instagram has been doing is like they say, I follow a lot of hunters and Instagram has things where they say, warning, this is sensitive content.
00:53:04.000Nature is Metal gets popped on that a lot too because Nature is Metal is an Instagram site that's all like these crazy images and videos of animals eating other animals and attacking other animals.
00:53:16.000And sometimes, some of them, they just decide, this one's too fucked up.
00:56:51.000You know, I sent Eddie Bravo this thing from The Guardian about the upsurge in people that believe in the flat earth and all of it because of YouTube videos and that apparently now YouTube is, they want to censor those.
00:57:07.000They want to, they feel like Flat Earth videos and I think another one, check this if I'm wrong about this, but I think they also want to lean on those anti-vaccination videos.
00:58:31.000But I think freedom of information sort of transcends a lot of these little debates.
00:58:37.000So if there was more freedom of information, so we actually knew everything the government knew about all of the different conspiracies and black projects, the black budget.
00:58:55.000The reality is that we don't know what's happening, and there is lots of secret stuff.
00:58:59.000The problem with that, though, is then you're dealing with foreign governments that are way better at keeping secrets than we are, and if they have access to our secrets.
00:59:07.000One of the things that's been kind of disturbing is seeing the actual influence that these Russian troll farms have had, not just on our political process, but in sowing seeds of dissent amongst people and starting conflict amongst people, and how people are buying into it.
00:59:25.000You know, like this podcast I've been talking about a lot with Sam Harris and Renee DiResta, that's her name, right?
00:59:31.000Where they talked about how these Russian troll farms set up a conflict by having a pro-Muslim rally across the street from a pro-Texas Pride rally.
00:59:43.000And they just set it all up and had it there and then a skirmish broke out.
00:59:47.000Because these people are across the street from each other.
00:59:49.000And they do this with all sides: they were running these African-American groups that were saying anyone but Hillary, really trying to get people to vote for Jill Stein, really trying to get people to even consider Trump, anyone but Hillary.
01:00:03.000And then they were also having ones that were against them.
01:00:19.000When you're in a position where you have a fairly small network, but it's influential, right?
01:00:25.000And then so you're watching Zuckerberg and the Facebook shit on TV, and they're talking to these congresspeople and senators, and they're talking to all these politicians about what's going on and how to stop it and what they're trying to do, and you feel like, oh God, this is an arena that I'm getting into.
01:01:59.000What's weird for people is that people are being hired to make these memes, and these memes may not have anything to do with their own personal ideology.
01:02:07.000They might just decide, hey, I'm going to collect this check. And apparently, according to Renee in this podcast she did with Sam Harris, they make really hilarious memes.
01:02:52.000I think that transparency and understanding what's going on with different accounts and if it's the real person, that's all important stuff.
01:03:30.000Every time there's a big scandal, every time, whether it's data manipulation or our first big growth spurt was during the Snowden days when he released all the information.
01:03:44.000People are really upset with what's happening.
01:07:48.000But just the principle that the experts could, because they will.
01:07:52.000You know, there's all kinds of think tanks and whatnot that would love to dive into the source code to understand how these companies were actually behaving.
01:08:00.000So, you know, waving the privacy flag without being open source or...
01:08:07.000This is getting a little bit into the weeds, but a lot of this comes down to licensing of content or code.
01:08:13.000So the license that we use for our code is called the AGPLv3, the GNU Affero General Public License, which means that anyone can take our code and use or modify it, as long as they share their changes under the same license.
01:11:54.000It was from six months prior, and that other person's podcast was on YouTube.
01:11:58.000It had nothing to do with Patreon, and they had specifically said that they were not going to act on content that was outside of their network.
01:12:06.000They were only going to react to things that were on Patreon.
01:12:58.000Don't go, no man, that ain't right, bro.
01:13:01.000Because that natural instinct to argue and to claim some sort of a personal identity with your ideas, that's part of the problem that we have.
01:13:12.000I think it's a main conflict issue with social media.
01:15:58.000Well, you know, they probably don't want to hang out with you anyway, let's be honest.
01:16:00.000But what you're doing by going back and forth... I know people who do engage in it, and sometimes they have these anxiety moments where they don't sleep for days because they're involved in these Twitter feuds.
01:16:17.000I mean, I know people that have done this, where they've gotten involved in Twitter feuds, and they'll wake up at 3 o'clock in the morning, they check their Twitter feed, and like, oh, Christ, man.
01:16:24.000Like, you gotta go on a yoga retreat or something.
01:16:39.000And it's like, okay, I'm not going to spend my time doing it that way.
01:16:43.000Some people want to spend their time doing it that way.
01:16:45.000And if there's cool mechanisms for the most voted content to be seen, I mean, okay, that's interesting to check out sometimes to look at feedback.
01:18:02.000And my concern is that what we're experiencing right now, in this flat form of two-dimensional text, is something that takes up an overwhelming amount of a lot of people's time.
01:18:13.000I mean, you're looking at some kids that are online, social media, eight, ten hours a day just staring at their phones.
01:18:19.000I'm extremely concerned, and I have some jokes about it in my act, about the next wave, because I think that we're overwhelmed by this incredibly attractive medium where we're attracted to our phones, we're attracted to this style of engaging in information and receiving information and passing information and online arguments and debates and looking at pictures and this constant stream,
01:18:45.000which, you know... Just looking at your phone, it's not that thrilling.
01:18:51.000It's just like, hmm, it's not that thrilling.
01:18:53.000It's like, okay, yeah, but it's still getting you all day long.
01:18:59.000My concern is when something really crazy does start to happen.
01:19:03.000When you really can have experiences that are hyper-normal, like that are more powerful than anything you can experience in this regular carbon-based physical touch-and-feel world.
01:19:15.000And once we start experiencing augmented reality, the integration between humans and technology, and then the ability to share augmented reality.
01:19:27.000If you were at work and you have these fucking goggles on and your girlfriend is at work on the other side of town and you guys both have these similar video pets that are with you and dancing around and providing you with fucking advertisements and giving you things,
01:19:44.000there's next levels to this stuff that I'm trying to see the future, but I'm too fucking stupid and I don't really know anything about technology, but I know that they're going to get deeper into our lives.
01:19:57.000I know that these technologies, not they like the government, but these technologies, they're going to get deeper into your life.
01:20:03.000And that they got you by the balls and the clit with a fucking phone.
01:24:22.000I was an early adopter and it was like clunky and shitty and then I would go to my iPhone and I was like, oh my god, this is so much better.
01:24:29.000What the iPhone is great at is integration, with Apple TV, integration with a laptop, but I also have a Windows laptop that I use a lot.
01:24:51.000It makes it easier for you to recognize where the keys are.
01:24:55.000And Apple has decided to go so far towards design and just for aesthetic beauty that they've ruined the tactile feedback of their keyboards.
01:25:06.000Do you remember that, though, when the old smartphones, they still had the keyboard?
01:25:11.000I thought I would never leave that because it was tactile, but then I ultimately left.
01:25:16.000That's true, but that's a different experience.
01:25:20.000I can do that with my thumbs and I kind of know where everything is and I'm not writing a novel.
01:25:24.000You know, when I'm writing material or essays or something like that, I need a fucking keyboard.
01:25:29.000You don't think that the holographic screen that's just here, you don't think if it just like autocorrects everything you do and you can just like...
01:27:14.000But this idea that, I don't want to give too much away, but, you know, he's acting as if they are going to be the moral authority about the types of content that can exist on the App Store.
01:28:45.000This mandate moves us to speak up for immigrants and for those who seek opportunity in the United States.
01:28:54.000We do it not only because their individual dignity, creativity, and ingenuity have the power to make this country an even better place, but because our own humanity commands us to welcome those who need welcome.
01:29:14.000It moves us to speak up for the LGBTQ community, for those whose differences can make them a target for violence and scorn.
01:29:24.000We do so not only because these unique and uncommon perspectives can open our eyes to new ways of thinking, but because our own dignity moves us to see the dignity in others.
01:29:40.000Perhaps most importantly, it drives us not to be bystanders as hate tries to make its headquarters in the digital world.
01:29:52.000At Apple, we believe that technology needs to have a clear point of view on this challenge.
01:29:59.000There is no time to get tied up in knots.
01:30:03.000That's why we only have one message for those who seek to push hate, division, and violence.
01:30:52.000Okay, I agree that you probably shouldn't put white supremacy music on, but there's a lot of really violent stuff that you can get on iTunes, right?
01:31:01.000I mean, if you go back to the old NWA albums, that's available, right?
01:31:27.000Is it that they're making the distinction between something that's fiction, that although it may be disturbing, you understand that this is a movie and this is something someone wrote versus someone...
01:32:55.000Well, not only that, there's division within the LGBTQ community. There's a big issue right now with Martina Navratilova that was going on, about her discussing the reality of trans women competing against biological women, and that she opposes it and thinks there are some fundamental advantages, which is leading to a lot of weightlifting world records being broken by trans women, and she's like,
01:33:38.000I go to this restaurant in Bridgeport, Connecticut called Bloodroot, which is like sort of an old-school feminist-like vegetarian vegan spot.
01:34:55.000Well, there's always going to be differing opinions, and especially when you have something like...
01:35:02.000Trans women competing against biological women, and, you know, you have someone like Martina Navratilova, who made it her life's work and her career competing as a biological woman.
01:35:15.000She's gonna have some opposition to that, and then there's the idea that everyone's supposed to be lumped in together with some mandate that no one has really openly discussed. You're supposed to agree, and it fluctuates and moves like the tide. It just changes.
01:35:32.000It's like this court of public opinion.
01:35:34.000It's constantly rendering new verdicts.
01:36:04.000There's a lot of blowback, and believe me, there's a lot of debate and discussion, but also, believe me, when someone does do some politically incorrect, really good stand-up, people go fucking bonkers.
01:36:22.000Oh, yeah, no, it's incredible material, but I'm just saying, for comics that are running into issues with getting banned or whatnot, I mean...
01:36:33.000Well, who's running into issues with getting banned?
01:38:44.000One of his company's actions was that he believes that the ability to communicate is a fundamental right, like the ability to get electricity.
01:38:50.000Like, if you're in the KKK, you can still order electricity.
01:38:54.000So, should you be able to just distribute information?
01:38:57.000If people say no, then you have to say, okay, well, who's to decide what can and cannot be distributed, and then who's to decide if they can go somewhere else?
01:39:06.000And then what happens if you tell a person they can't go anywhere?
01:39:11.000We're looking at more of a community moderation structure so that we've even been considering like a juror system so that if we make a bad decision and someone appeals it, then the community can potentially...
01:39:33.000I think that's where it's going to go.
01:39:52.000The GDPR, the European privacy law, has this whole idea of the right to be forgotten online, which is very difficult, because deleting things from any database, especially a blockchain, is not easy.
01:40:06.000So the idea that you can go on the internet, do crazy shit, and then just have it taken away, it's a paradox, because privacy means control, but it doesn't jibe with the way that technology works to just be able to delete things.
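Why deletion is so hard on a blockchain can be sketched with a toy hash chain in Python (a hypothetical illustration, not Minds' actual stack): each block's hash commits to the previous block's hash, so erasing or rewriting any record invalidates every block after it.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Build a list of (record, hash) pairs, each linked to the last."""
    chain, prev = [], "genesis"
    for rec in records:
        prev = block_hash(rec, prev)
        chain.append((rec, prev))
    return chain

def verify(chain) -> bool:
    """Recompute every link; a tampered or removed record breaks the chain."""
    prev = "genesis"
    for rec, h in chain:
        if block_hash(rec, prev) != h:
            return False
        prev = h
    return True

chain = build_chain([{"user": "a", "post": "hello"}, {"user": "a", "post": "oops"}])
assert verify(chain)
# Try to "forget" the second post in place: verification now fails.
chain[1] = ({"user": "a", "post": "[deleted]"}, chain[1][1])
assert not verify(chain)
```

This is why a right-to-be-forgotten request can't simply edit a record in place on a chain: every participant who re-verifies the history would see the mismatch.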
01:40:45.000I think that what we're dealing with now is like you have to interface with it, right?
01:40:51.000You have to interface with your computer, you have to interface with your phone to access all this stuff.
01:40:57.000My real concern is that that's just a temporary step.
01:41:01.000And that we're going to just consistently and constantly be interfaced with each other.
01:41:07.000You know, Elon brought something up when he was on the podcast, called Neuralink.
01:41:12.000And he didn't want to fully describe it because he said he couldn't, but he said it's going to be live within a matter of X amount of months.
01:41:18.000And he was talking about it increasing the bandwidth between human beings and information in a radical way that's going to change society.
01:42:12.000I mean, our fucking TriCaster crashes every other podcast.
01:42:15.000Yeah, whether it's open source or free or not makes no difference to whether it can fuck up your brain.
01:42:20.000Right, what if somebody puts that shit in and then, for whatever reason, they have a blown fuse and they stomp on the gas and drive right into a tree?
01:42:26.000It depends on the level of risk you're willing to take.
01:42:29.000I mean, you see some of those videos, like, I've cried at those videos where, like, the woman, like, hears for the first time, you're like, oh... Yeah, yeah, yeah.
01:42:35.000And people seeing color for the first time, putting on certain glasses that allow them to see color.
01:44:38.000If one of the big companies, Google, Facebook, had just been free and open source, we would have spent the last seven years building on top of them.
01:46:06.000Well, we put things in people's feeds that they want to see.
01:46:09.000We put things that people want to debate about and argue about and political things, all sorts of different things that excite them and get them to be engaged with the platform.
01:46:28.000Well, they grow by maximizing their profits and by maximizing the amount of eyes that get to their advertising so they get more clicks and more people get engaged.
01:49:33.000Okay, again, to play devil's advocate, the vast majority of users are not using those platforms.
01:49:38.000The vast majority of users are using these controlled platforms like Facebook and Instagram and Twitter.
01:49:44.000Like, if you're talking about, I'm just guessing, but if you're talking about the gross number of human beings that are interacting with each other on social media, they're mostly on controlled networks.
01:49:56.000You're saying that this is not going to last.
01:49:59.000But there's no evidence that it isn't going to last.
01:50:52.000But once we have functionally competitive products that you wouldn't even know the difference and there's enough people there, then it's basically the decision of, you know, am I going to choose the one that respects my privacy and freedom or the one that doesn't?
01:51:08.000And people are – kids don't like Facebook.
01:52:25.000It's just hypocritical to the maximum.
01:52:29.000I think it's partly because it's a giant business.
01:52:32.000And I think when you have an obligation to your shareholders and to maximize profits...
01:52:37.000And when you're trying to maximize profits, too, and there's this universal growth model where every year it just has to get a little bit bigger, otherwise you're fucking up as a CEO. You don't have to experience that with Minds.
01:54:44.000Occasionally, if I'm bored, like I'm in the dentist's office, you have 10 minutes, all right, let's see what the fuck's going on in the news.
01:54:58.000It's so taxing, and it's so involved, and so many people are doing it.
01:55:04.000I mean, I went to a restaurant the other day, and I was looking around, and fucking everyone was sitting at a table looking at their phone.
01:55:51.000You know, the guys who run Joe Beef in Montreal, it's this amazing restaurant, Fred and Dave, and they were talking about it, that when they go to dinner, they shut their phone off.
01:56:12.000It's a similar thing to podcasting, in a way, in that one of the good benefits of podcasting is that for three hours or two hours, whatever the fuck you're doing, you're going to sit down, and you're just going to engage with the person.
01:56:35.000And that is one of the rare moments in life where you get to talk to someone for several hours.
01:56:39.000And over the last, you know, nine years that I've been doing this podcast, it's benefited me tremendously just in having real conversations with people.
01:56:51.000We're just sitting across from somebody for hours just talking to them.
01:56:55.000Getting better at understanding how people think, getting better at understanding how I think, getting way better at communicating and knowing when to talk and when not to talk and what questions to ask and try to understand the thought process that another person has.
01:57:10.000And you walk out of that with some lessons, like real, legit, tangible lessons.
01:57:15.000Those fucking don't happen when you're staring at your phone while you're talking to people.
01:57:46.000We need everyone to have the ability to share, so that you can check, because maybe you're more likely to get the reality of what's going on in the world from your newsfeed than from the big companies.
02:00:52.000Well, that was what the Sober October thing kind of turned out to be about.
02:00:56.000And there's a lot of lessons in learning that, too.
02:00:59.000You know, you learn lessons about your reliance on either substances or things.
02:01:05.000And one of the things that I learned from the Sober October Challenge, the last one, was that when you engage in really rigorous physical activity six and seven days a week, you don't give a fuck.
02:01:28.000It's really amazing, because I think a lot of the personal anxiety that people carry around with them is a physical energy that's not being expressed. The body has certain demands and certain potential, and in order to have that potential for athletic output, you have to have this energy source...
02:02:22.000You burn off 2,000 calories and you fucking run for five miles and you do kettlebells and chin-ups and fucking hit the bag for five rounds.
02:03:23.000You would shame them into doing it, or you would somehow or another make it seem like they would advance in the company more if they played along.
02:03:54.000But for whatever reason, that's their choice.
02:03:57.000It should be your choice to go out like Christopher Hitchens and just fucking drink every day and smoke cigarettes and one day you get cancer.
02:04:18.000And most of the madness that we see in brilliant artists, it's very possible that that madness would not be expressed if they had their shit together.
02:04:28.000There was something that Sam Harris was saying the other day on your show, just about the free will stuff.
02:04:35.000And I think that connects to this information theory kind of thing.
02:04:39.000So if we're just sort of a conglomerate of these actions and we're like flowing the actions through our body in unique ways...
02:04:52.000I mean, do you accept his theory on free will?
02:05:48.000Larry Lessig, who was on here the other day, you guys didn't even talk about this, but he basically is one of the founders of Creative Commons and this whole licensing structure for content.
02:05:56.000Like what we're saying right now, this is going to be licensed.
02:07:35.000Well, you are certainly if you put in the work.
02:07:37.000Like, let's say you decide to write a book.
02:07:39.000I mean, you put hundreds and hundreds of hours into this book and edit this book and then you release the book and someone says, no, you didn't create that.
02:07:49.000You're a product of determinism, and I'm going to just steal your book.
02:08:34.000I... What if somebody makes all the money off of your book because they have a better platform to sell your book and they don't give it to you at all and you wrote the book.
02:10:05.000Well, it is complex if you're saying that all human beings, essentially, all of your actions have been determined by a lot of factors that are outside of your control.
02:10:18.000Whether it's genetics, again, life experience, education, all the different factors.
02:11:50.000Because Stephen King had to spend countless hours in front of his laptop trying to go over each and every sentence and each and every paragraph and suck you in and rope you in and all this work.
02:13:28.000The complicated issue of who you are and why you are who you are and who you are at this moment versus who you are a decade ago or two decades ago.
02:14:14.000And while he was talking to me, I'm like, is that even really me?
02:14:18.000Like, is he even really talking about me?
02:14:20.000Because I don't have any connection to the stuff that he's saying.
02:14:24.000And I understand that he has this vague, distant, ghost-like memory in his mind of some slide images that he's pieced together that he recognizes as a past interaction.
02:14:40.000It's super strange too, like in, you know, 2050 or 2045, whatever.
02:14:45.000You know, if your body can be replaced one piece at a time, as time goes on, then your body literally, you could survive, but your body is going to be like almost completely different.
02:15:03.000Was it Graham Hancock that used that analogy?
02:15:06.000Somebody used this analogy of certain boats that are like really ancient boats that are on display and every single piece of them from the original boat has been replaced because they rotted away.
02:15:15.000And you're like, okay, what am I looking at?
02:17:25.000Do you want to keep your legs or do you want to get these legs that allow you to jump over a building?
02:17:30.000I'm curious if there's really like superhuman projects that are going on where people actually can have these abilities.
02:17:40.000With classified information, we just know that there's stuff we don't know about, that there are extraordinary projects.
02:17:51.000So, you know, with this being the future, I feel like there's a disconnect between the state of technology on planet Earth right now, what the public has access to, versus what the, you know, black projects have access to.
02:18:06.000And that is really not cool because it's not fair for humanity to not understand what is going on.
02:18:18.000I think that's true, but I also think that most of the state-of-the-art stuff is peer-reviewed, right?
02:18:26.000I mean, there's so many different people working on these different technologies, like CERN. They're working on the Large Hadron Collider or anything else.
02:18:33.000There's so many different people working on it.
02:18:36.000The people that are at the forefront of the technology, unless they're all gobbled up by the dark government, you know, the people at the head of the line kind of understand where the technology is at currently.
02:18:46.000For sure, for you and I, we don't know what the fuck's going on.
02:19:58.000I really appreciate your perspective, and I really appreciate your point of view, and I really appreciate your ethics and what you're working towards with Minds, and that's one of the reasons why I wanted to talk to you.
02:20:09.000And as much as I fuck around and play devil's advocate, I do that to try to get to, you know, how you're thinking and whether or not you've had these arguments in your own mind.
02:20:18.000But I think, ultimately, I've said this before, and I don't know if it makes sense, because again, I'm not that smart.
02:20:26.000I really wonder if there's bottlenecks for progress that we're going to run into.
02:20:34.000And I think, ultimately, information is one of the big ones.
02:20:39.000And information also, in a lot of ways, is money.
02:21:10.000And it's kind of like information on a database.
02:21:13.000And what if we get to a certain point in time, and I sort of feel like in this weird, vague, abstract way, we're moving towards this.
02:21:22.000It's one of the things that when I really step back and wonder about this trend towards socialism and social democratic thinking, I wonder what that is.
02:21:30.000And I honestly think that we're moving towards this idea that, hey, we've got a lot of fucking problems that could be cured if you move some of that money around.
02:21:41.000But should you be able to move some of that money around?
02:21:44.000And what happens if that money becomes something different?
02:21:49.000What if people start developing social currency instead of financial currency?
02:21:56.000What if your ability to do things was based on how much you actually put in?
02:22:01.000We assume that the way we do things now, where if you want to buy a car, you have to have $35,000.
02:22:07.000That's how much a Mustang costs, and you've got to bring it to the bank, and this and that, and get approved for a loan.
02:22:12.000But what if we get to a time in the future where it's not these pieces of paper that give you material objects, but rather your own actions and deeds
02:22:22.000provide you with social currency that allows you to go on vacations, or allows you to eat at restaurants, or allows you to do things, and there's this running tally.
02:22:30.000That's not outside of the realm of possibility.
02:22:33.000No, I think reward systems within everything that we're using are gonna rise up.
02:22:39.000I mean, that's what we're already kind of doing.
02:22:42.000I mean, we reward tokens for activity.
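A token-reward tally like the one described here can be sketched very simply in Python (hypothetical action names and reward values, not Minds' actual reward schedule): users earn tokens for activity and spend them on perks.

```python
from collections import defaultdict

# Hypothetical per-action rewards; not an actual platform's schedule.
REWARDS = {"post": 2, "comment": 1, "upvote_received": 1}

class TokenLedger:
    """Minimal social-currency ledger: earn tokens for activity, spend them on perks."""

    def __init__(self):
        self.balances = defaultdict(int)

    def record_activity(self, user: str, action: str) -> None:
        """Credit the user for a recognized action; unknown actions earn nothing."""
        self.balances[user] += REWARDS.get(action, 0)

    def spend(self, user: str, cost: int) -> bool:
        """Deduct tokens if the user has enough; return whether it succeeded."""
        if self.balances[user] >= cost:
            self.balances[user] -= cost
            return True
        return False

ledger = TokenLedger()
for action in ["post", "comment", "comment", "upvote_received"]:
    ledger.record_activity("alice", action)
print(ledger.balances["alice"])  # 2 + 1 + 1 + 1 = 5
print(ledger.spend("alice", 3))  # True, leaving a balance of 2
```

The interesting design question the conversation raises is what sits in `REWARDS`: pricing community-positive behavior is the hard part, not the ledger.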
02:22:46.000We're gonna see... But what I'm saying is, if it's a social currency, and your own personal behavior allows you to access more freedoms or more goods or more things,
02:23:04.000Positive behavior and community-based behavior because that would be the only way to advance.
02:23:10.000I mean, obviously this is a long time down the line, but when the first caveman, you know, traded the first fucking shiny rock for the first spearhead, whatever it was that they did that started this whole inevitable trend towards money. This is not something that has to be this way forever.
02:23:27.000And I wonder, when we're looking at the distribution of information, which is arguably, not arguably, it's never been like what we have today.
02:23:37.000There's never been a time in human history where everyone had so much access to information that you used to have to pay for.
02:23:53.000And this is a whole different way of interfacing with information.
02:23:57.000I think this is going to affect higher learning institutes.
02:24:00.000I think it's going to affect a lot of different things.
02:24:01.000But I wonder if this all can be applied ultimately someday, maybe not in our generation, but someday to money, that people start using social currency.
02:24:13.000And that social currency is going to be almost like we have some sort of a database of social currency in this country.
02:24:22.000As long as the government is running on open systems... I think the reason we struggle with trusting the government to distribute wealth is that it's so inefficient.
02:24:41.000I mean, at the end of the day, that's a giant problem, period.
02:24:44.000If the people that are deciding what we can and can't do with information are also corrupt, which, I mean, there's laws that allow them to be corrupt, but it doesn't mean that they're not corrupt, right?
02:24:57.000I feel like every politician, the only politicians that I would support at this point, I want to be pulling us in a direction that is making their own position irrelevant.
02:25:09.000Basically, building open, secure voting systems that allow the planet or the country to decide and vote on what we're doing.
02:25:21.000I mean, you know, I just think that we need more accurate representation of the consciousness of the communities.
02:25:32.000And it shouldn't just be these singular people deciding for everybody.
02:25:40.000And by the time they get in there, they're so compromised by the special interest groups that are helping them out and all the different people that are contributing to their campaign fund.
02:25:49.000Do you see anybody like that on the horizon?
02:25:53.000I think that there are – not specifically right now.
02:25:57.000I don't see anyone talking about open systems and secure voting and completely changing the way that we're making decisions.
02:26:10.000But I think that's probably just because they don't know about it.
02:26:12.000I think there would be a lot of politicians who would be okay with that.