The combined forum and Discord community is the best “guild” of people I’ve ever encountered. If you are reading this, you are one of the extra-awesome community members we’re referring to. Everyone helps each other; people are friendly, knowledgeable, and resourceful, and we almost never need to ban anyone here. It’s a haven.
However, every now and then there are in-game players who bring toxicity in some form. While reviewing our Steam reviews, I noticed that most complaints involve toxicity from players who are not part of our guild.
Because of this, we want you to have an awesome experience, and we have crafted what is arguably one of the most fluid and transparent human moderation systems an indie game can offer (more info below):
While we’re super proud of our system, and you can see your reports being acted on in the Discord #justice channel, there’s one major double-edged sword:
Let’s discuss some reasonable ways to alter our system so that we can moderate without backlash against the developers. Some players will leave a revenge review over just a warning or a 1-day suspension, sometimes with up to 800 hours logged.
Up until now, we have been responding to such reviews with additional context and proof, but this requires investigation, research, and time. We want to think deeper about how we can avoid this altogether.
As a community-based game, we are reaching out to our players for ideas.
Here is what we have tried:
Providing a screenshot of the logs as proof (to counter “banned for nothing” reviews)
Adding more information to suspension messages (the date/time the player is unbanned)
Unless the offense is extra-bad, scaling from warning >> 1-day suspension >> 3 days >> 7 days >> etc.
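For the curious, the escalation ladder above can be sketched roughly like this. This is only an illustration; the function name, the exact day counts past 7, and the “extra-bad” handling are assumptions, not our actual code:

```python
# Illustrative sketch of the escalation ladder (not the game's real code).
# 0 means a warning; subsequent offenses escalate the suspension length.
ESCALATION_DAYS = [0, 1, 3, 7, 14]  # days past 7 are an assumption

def next_penalty(prior_offenses: int, severe: bool = False) -> str:
    """Return the next disciplinary step for a player."""
    if severe:
        # "Extra-bad" behavior skips the ladder entirely (assumed handling).
        return "permanent ban"
    step = min(prior_offenses, len(ESCALATION_DAYS) - 1)
    days = ESCALATION_DAYS[step]
    return "warning" if days == 0 else f"{days}-day suspension"

print(next_penalty(0))  # → warning
print(next_penalty(2))  # → 3-day suspension
```

The point of the ladder is that most players stop after the first warning, so the harsher steps rarely come into play.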
Here is what we CANNOT do:
Some of these may be obvious, as we need to respect Valve’s rules; but just in case, we cannot:
Ask for a review in-game (like annoying mobile apps)
Exchange something for a review
Issue disciplinary action for a revenge review
Here are the ideas we’ve had, but passed on:
A “bad guys” matchmaking queue instead of a suspension. We can’t do this because a match takes 16 players, and we don’t have that many players suspended at once. Generally, after the first warning or suspension, most players STOP being toxic (which is a good thing). This would only work for ~2-player games.
Only issue warnings. If there’s no real suspension behind them, people will just keep going.
Current ideas that may work:
Instead of suspensions, we ONLY issue warnings. After 3 strikes within x amount of time, the player is suspended for 1 year.
Continue with current system, but :insert some really smart change:
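To make the first idea concrete, here is a rough sketch of the “3 strikes within a time window” logic. The window length is a placeholder, since the post leaves “x” open, and the names are hypothetical:

```python
# Hypothetical sketch of the "3 strikes within x time" proposal.
# WINDOW is a placeholder; the actual value of "x" is undecided.
from datetime import datetime, timedelta

STRIKE_LIMIT = 3
WINDOW = timedelta(days=90)            # placeholder for "x"
LONG_SUSPENSION = timedelta(days=365)  # the 1-year suspension

def action_for(strikes: list[datetime], now: datetime) -> str:
    """Warn until STRIKE_LIMIT strikes fall inside WINDOW, then suspend."""
    recent = [s for s in strikes if now - s <= WINDOW]
    if len(recent) >= STRIKE_LIMIT:
        return f"suspended until {(now + LONG_SUSPENSION).date()}"
    return f"warning ({len(recent)}/{STRIKE_LIMIT} strikes in window)"

now = datetime(2024, 1, 1)
print(action_for([now - timedelta(days=10), now - timedelta(days=5), now], now))
```

One design question this raises: old strikes aging out of the window effectively forgives reformed players, which fits the observation that most players stop being toxic after their first warning.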
Our system works well, but we need a way to prevent revenge reviews that also complies with Valve (Steam)'s terms of service. Revenge reviews are really stressing out the devs and require additional resources that we could otherwise be using to make the game better.
The better we moderate, the more potential there is for revenge reviews. This makes moderation a bit tough; we need some changes, somehow. We could use your help with a quick survey:
Do you like or dislike the way we moderate?
What are your thoughts about how we handle moderation?
How would you improve our moderation system?
Psychologically speaking, let’s dive into the mind of a suspended player:
If you were suspended for 1-day for racism, what would discourage you from revenge reviewing?
Any other ideas to allow us to moderate without potentially receiving backlash?