
Power to the People - The Minds Jury System

Minds · May 15, 2019, 7:55:56 PM

Almost every app in the world is failing at moderation. Some over-moderate and others don’t provide sufficient tools to hide or report what you don’t want to see. Some have unleashed dysfunctional AI and machine-learning algorithms that mistakenly take down innocent content. Legacy and non-transparent networks have consistently abused their power to control the narrative, and have proven on many occasions to have subjective bias, inconsistency and double-standards.

The worst part is that in many cases there is no way to appeal, and even if there is, the decision is being determined by secretive processes. When a mistake is made, there is no way to correct it or present a case.

Our goal at Minds is to build a more transparent and fair moderation process to avoid these pitfalls. The purpose of this blog is to outline the foundation of this system.

Why do we need moderation?

First and foremost, let’s define the term “moderation” and describe why it is necessary.

Subscribed feeds

Subscribed feeds provide content that you have explicitly opted-in to see by electing to subscribe to a specific channel. Your newsfeed is an example of a subscribed feed.

Because subscribed feeds require user consent, there is much less need for moderation. There is still reason to monitor for illegal content such as child pornography or true threats of violence, but beyond that, the act of subscribing means you are opting in to see all of that channel's content without any filters.

Unsubscribed Feeds

Unsubscribed feeds provide content that you have not explicitly opted-in to see. Boosted content, Search results and Discovery feeds are examples of unsubscribed feeds on Minds. These feeds are displayed to everyone on the site to provide the community with the ability to discover new content and see what is trending.

Because viewers have not explicitly opted in to see this content, the default feeds exclude content that is NSFW (not safe for work). This ensures that users will not, by default, see content that has been deemed unsafe for the workplace.

Minds defines NSFW content to be posts containing nudity, pornography, violence and gore, profanity and/or sensitive commentary on race, religion and gender. It’s an unavoidable reality that certain content needs to be put behind these filters in order for there to be a version of Minds that is family friendly and safe for use at work. Minds will rely on user reports to identify posts that may be in violation.

All channels may easily opt in to view NSFW content in their unsubscribed feeds. The goal is to provide the end user with total control of their feed, so that they can decide on the types of content they see (or don't see).
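Conceptually, this opt-in model amounts to a per-category filter applied to unsubscribed feeds. Here is a minimal sketch of that idea in TypeScript; the type names, field names, and category labels are illustrative assumptions, not Minds' actual data model:

```typescript
// Hypothetical NSFW categories; the labels here are illustrative, not Minds' exact tags.
type NsfwCategory = 'nudity' | 'pornography' | 'violence-gore' | 'profanity' | 'sensitive-commentary';

interface Post {
  id: string;
  channelId: string;
  nsfwTags: NsfwCategory[]; // empty means the post is safe for work
}

interface ViewerSettings {
  // Categories the viewer has explicitly opted in to see in unsubscribed feeds.
  nsfwOptIns: Set<NsfwCategory>;
}

// Keep a post in an unsubscribed feed only if every NSFW tag on it is covered
// by the viewer's opt-ins; untagged posts always pass.
function filterUnsubscribedFeed(posts: Post[], viewer: ViewerSettings): Post[] {
  return posts.filter(post => post.nsfwTags.every(tag => viewer.nsfwOptIns.has(tag)));
}
```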

What is the Minds Jury System?

Moderation should be based on fair process and not the subjectivity of our staff. It is essential that the community has a voice in this process over what content gets moderated and what does not.

The Minds Jury System will be released on Monday, May 20th, 2019. Our goal is digital democracy: to ensure that all content moderation decisions are transparent and go through due process under the values and principles described in the Santa Clara Principles.

Initially, the Jury System will be used for appeals on moderation decisions. Every time a post is moderated (such as being marked as NSFW or spam), the user will be notified of the specific action and given the ability to appeal and provide additional context as to why the decision should be changed.

The appeal will then be randomly sent to 12 unique active users on the site who are not subscribed to the reported channel. These users will be given the choice to participate, pass, or opt-out of all future jury events. If a user passes or opts-out, another random user will be notified until 12 unique channels have joined the jury and voted on the appeal.
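As a rough sketch of how that selection loop might work, here is one way to express it in TypeScript. The function names, parameters, and data shapes below are assumptions for illustration, not Minds' actual API:

```typescript
// Hypothetical shapes; Minds' real user model and notification flow will differ.
interface Candidate {
  id: string;
  isActive: boolean;
  subscribedTo: Set<string>;   // channel ids this user subscribes to
  optedOutOfJuries: boolean;
}

const JURY_SIZE = 12;

// Draw random candidates until JURY_SIZE unique, eligible users have agreed to serve.
// `askToServe` stands in for the notification/consent step described above.
async function selectJury(
  reportedChannelId: string,
  drawRandomCandidate: () => Candidate,
  askToServe: (c: Candidate) => Promise<'participate' | 'pass' | 'opt-out'>
): Promise<Candidate[]> {
  const jury: Candidate[] = [];
  const asked = new Set<string>();

  while (jury.length < JURY_SIZE) {
    const candidate = drawRandomCandidate();

    // Eligibility: active, not already asked, not subscribed to the reported
    // channel, and has not opted out of all future jury events.
    if (
      asked.has(candidate.id) ||
      !candidate.isActive ||
      candidate.optedOutOfJuries ||
      candidate.subscribedTo.has(reportedChannelId)
    ) {
      continue;
    }
    asked.add(candidate.id);

    const choice = await askToServe(candidate);
    if (choice === 'participate') {
      jury.push(candidate);
    }
    // On 'pass' or 'opt-out', the loop simply draws another candidate.
  }
  return jury;
}
```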

If 75% or more of the jury votes in favor of the appeal, then the initial decision will be overturned. If fewer than 75% vote in favor, then the initial decision will be upheld.
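The verdict rule itself is a simple threshold check: with 12 jurors, at least 9 of 12 (75%) must vote to overturn. A short sketch:

```typescript
// Verdict rule as described above: 75% or more "overturn" votes flips the original decision.
const OVERTURN_THRESHOLD = 0.75;

function decideAppeal(votes: Array<'overturn' | 'uphold'>): 'overturned' | 'upheld' {
  const inFavor = votes.filter(v => v === 'overturn').length;
  return inFavor / votes.length >= OVERTURN_THRESHOLD ? 'overturned' : 'upheld';
}
```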

All users who participate in the jury will receive a confidence score that is determined by the number of times they vote in favor of the eventual verdict. Users who continuously vote in opposition to the eventual verdict may be disqualified from participating in future juries.
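One way to picture the confidence score is as the fraction of a juror's past votes that matched the final verdict, with a cutoff below which a juror is disqualified. The cutoff values in this sketch are illustrative assumptions; the post does not specify them:

```typescript
// Hypothetical running tally of how often a juror sides with the eventual verdict.
interface JurorRecord {
  votesCast: number;
  votesWithVerdict: number;
}

// Illustrative thresholds only; Minds has not published the actual values.
const MIN_CONFIDENCE = 0.25;
const MIN_VOTES_BEFORE_CHECK = 10;

// Update the record after an appeal is decided.
function recordVote(rec: JurorRecord, votedToOverturn: boolean, verdictOverturned: boolean): JurorRecord {
  const matched = votedToOverturn === verdictOverturned;
  return {
    votesCast: rec.votesCast + 1,
    votesWithVerdict: rec.votesWithVerdict + (matched ? 1 : 0),
  };
}

// Confidence score = fraction of past votes that agreed with the final verdict.
function confidenceScore(rec: JurorRecord): number {
  return rec.votesCast === 0 ? 1 : rec.votesWithVerdict / rec.votesCast;
}

// Jurors who continuously vote against the verdict fall below the cutoff.
function isDisqualified(rec: JurorRecord): boolean {
  return rec.votesCast >= MIN_VOTES_BEFORE_CHECK && confidenceScore(rec) < MIN_CONFIDENCE;
}
```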

What happens if your content gets moderated?

Different actions will be taken based on the specific term that is violated.

Strike Offense

Users will receive a strike for certain term violations. Users will be notified about the strike, which term was violated, and which specific piece of content was in violation. All strikes can be appealed to a Minds Jury. Individual strikes will expire after 90 days.

The following term violations will result in a strike:

Untagged NSFW Post (three strikes required for each individual NSFW category)

* Strike 1 = Warning

* Strike 2 = 2nd Warning

* Strike 3 = Full channel marked with NSFW category

Harassment and Spam

* Strike 1 = Warning

* Strike 2 = 2nd Warning

* Strike 3 = Ban

Spam may result in an immediate ban if determined to be malicious or by use of a bot.
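Putting the 90-day expiry and the escalation ladders above together, strike accounting could look roughly like the sketch below. Field names and category labels are assumptions for illustration:

```typescript
// Hypothetical strike record; the category distinguishes e.g. spam from each
// individual NSFW tag, since the ladder applies per category.
interface Strike {
  category: string;
  issuedAt: Date;
}

const STRIKE_TTL_DAYS = 90;

// Count strikes in a category that have not yet expired (90 days, per the post).
function activeStrikes(strikes: Strike[], category: string, now: Date = new Date()): number {
  const cutoff = now.getTime() - STRIKE_TTL_DAYS * 24 * 60 * 60 * 1000;
  return strikes.filter(s => s.category === category && s.issuedAt.getTime() >= cutoff).length;
}

// Map the active-strike count onto the escalation ladder above; the third-strike
// consequence depends on whether the category is harassment/spam or an NSFW tag.
function consequence(count: number, category: string): string {
  if (count <= 0) return 'none';
  if (count === 1) return 'warning';
  if (count === 2) return 'second warning';
  return category === 'harassment' || category === 'spam'
    ? 'ban'
    : `channel marked NSFW (${category})`;
}
```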

Immediate Ban Offense

Users will be immediately banned for certain violations. Users will be notified about the ban and which term was violated, but they will not be able to see the content that was in violation as it will have to be removed from Minds.

Appeals on immediate bans will be reviewed by the Minds admins and not a jury due to the nature of the content.

The following violations will result in an immediate ban:

- Illegal (terrorism, pedophilia, extortion, fraud, revenge porn, sex trafficking)

- Personal and confidential information (doxxing)

- Malware

- Token manipulation

- Impersonation

- Incites a true threat of violence

Summary

Minds' goal is to preserve free speech and allow all content that is clearly legal under US law.

Our hope is that moderation for unsubscribed feeds in tandem with the peer-based Jury System will ensure that Minds remains a bastion of free speech while simultaneously providing users with full control over the content they see.

The Jury System ensures that all channels have the chance to appeal moderation decisions directly to their peers. If the system proves to be effective, then mistakes will be corrected and correct decisions will be upheld.

We will be updating our terms of service to reflect these changes prior to release. We look forward to all of your feedback on the system and are here to answer any questions you may have.

It's critical to understand that all of this software is fully free and open source, subject to peer review and open to change. We don't believe this is yet a perfect solution, but we do believe in this process as the most viable method of ethical technology development and moderation.

As always, thank you for being a pioneer of the free and open Internet!

The Minds Team