skeletonbow: … The best thing GOG could do at this point is to remove the Rep score completely and replace the rate up/down buttons with a single "Report" button for posts believed to violate GOG's terms of service, along with a dialog that asks what the violation or problem is, offering multiple-choice options based on the actual terms of service, similar to how Twitch.tv does it. Then, when someone reports a post, keep track of an internal score nobody else can see: if the report isn't a violation or just seems vindictive, the reporter gets a -1 to their internal score, and if they report a legitimate violation they get a +1.
Put all reports in a queue, where each report is weighted by the accumulated internal reputation of the people reporting it, and act on them in order. Posts reported only by people with very low or negative scores get the lowest investigation priority, and those who consistently file false reports eventually sink below some negative threshold where they are completely untrusted to report legitimate violations. …
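A minimal sketch of that trust-weighted queue, assuming a simple summed-trust priority (all names here are hypothetical illustrations, not anything GOG actually runs):

```python
import heapq

trust_scores = {}  # reporter_id -> hidden internal score; never shown to users

def combined_trust(reporter_ids):
    # A report's priority is the summed internal trust of everyone who
    # filed it; unknown reporters start at 0.
    return sum(trust_scores.get(r, 0) for r in reporter_ids)

report_queue = []  # max-heap via negated priority: highest combined trust first

def file_report(post_id, reporter_ids):
    heapq.heappush(report_queue,
                   (-combined_trust(reporter_ids), post_id, reporter_ids))

def next_report_to_investigate():
    # Reports filed only by low/negative-trust accounts naturally sink
    # to the back of the queue.
    _, post_id, reporter_ids = heapq.heappop(report_queue)
    return post_id, reporter_ids
```

One caveat of this sketch: the heap freezes a report's priority at filing time, whereas a real system would presumably re-weight a report as additional people file against the same post.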
scientiae: This is actually a good idea. It's not perfect (since even an untrustworthy account may legitimately capture a violation), but it is certainly both economical and sensible. It would pay for itself pretty quickly, if there were a lot of reports to police. (I'm not sure that is the case, however.)
That's fine, though, because if someone with low trust correctly reports something that is in violation, chances are that many more people will report it too. So the combined trust of everyone who submits a report is what affects its position in the queue. If the report is approved as a violation, everyone who reported it has their score raised, including those with low trust, so they can still gain trust.
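Continuing the sketch above, the resolution step might adjust every reporter's hidden score by the same delta, which is what lets a low-trust account climb back up by filing genuine reports (again, purely illustrative):

```python
trust_scores = {}  # reporter_id -> hidden internal score, as in the sketch above

def resolve_report(reporter_ids, was_violation):
    # Moderator ruling: +1 to every reporter on a confirmed violation,
    # -1 on a false or vindictive report.
    delta = 1 if was_violation else -1
    for r in reporter_ids:
        trust_scores[r] = trust_scores.get(r, 0) + delta

# Example: a negative-trust account joins two trusted ones on a valid report.
trust_scores.update({"alice": 5, "bob": 2, "mallory": -3})
resolve_report(["alice", "bob", "mallory"], was_violation=True)
print(trust_scores["mallory"])  # -2: legitimate reports rebuild trust
```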
But people who abuse the system more than they use it properly will always end up with a lower trust metric. If someone eventually has a change of heart and starts being more responsible with how they use the forums and report things, they'll gradually regain the system's trust over time.
The system would be a sliding trust metric, with somewhat higher weight placed on more recent activity than on older activity, thus rewarding people who contribute positively to the system even if they did the opposite at an earlier point in time.
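One common way to implement that kind of recency weighting is an exponentially weighted running score, where each new outcome decays the influence of everything before it (the 0.9 decay factor below is an arbitrary illustration, not a proposed value):

```python
def sliding_trust(outcomes, decay=0.9):
    # outcomes: chronological list of +1 (valid report) / -1 (false report).
    # Each older outcome is multiplied by `decay` once per newer outcome,
    # so recent behavior dominates the score.
    trust = 0.0
    for outcome in outcomes:
        trust = decay * trust + outcome
    return trust

# A reformed abuser: three false reports followed by three valid ones.
print(sliding_trust([-1, -1, -1, +1, +1, +1]))  # ~0.73: past abuse fades
```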
The exact algorithms and how they're tweaked would be an internal matter, adjusted over time based on how well they produce the desired results, and to handle corner cases or any ways someone figures out to abuse the system.