Content moderation is a way to control unrelated, obscene, or illegal content submitted by users. Content that violates a set of rules and guidelines created by the author of the forum or website can be removed manually or automatically.
There is a wide variety of moderation types that a community moderator or manager may want to incorporate into a forum or rating system. When choosing a method of maintaining order and relevant content, consider the five most common moderation types below:
1. Pre-moderation

In pre-moderation, content a community member submits enters a queue, where a manager or moderator reviews it before it is posted to other members, ensuring the content meets community standards. Although pre-moderation is an effective method of controlling community content, it has several drawbacks for community members. Not only must members wait for their comments to clear review before their posts appear, but the cost of managing this type of moderation can also be high. Furthermore, if the community is large or the site grows, these tasks may overwhelm moderators or managers, and the delay can cause members to lose interest and leave the community altogether.
Communities at higher risk of legal ramifications are the main candidates for pre-moderation. Websites such as celebrity forums or child-oriented sites are among the few likely to benefit from this time-consuming and costly approach.
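The pre-moderation workflow described above can be sketched in a few lines. This is a minimal illustration, not a production design; the class and method names (`PreModerationQueue`, `review_next`) are hypothetical:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    approved: bool = False


class PreModerationQueue:
    """Submissions wait in a queue; nothing is visible until a moderator approves it."""

    def __init__(self):
        self.pending = deque()   # content awaiting review
        self.published = []      # content visible to the community

    def submit(self, post: Post) -> None:
        # The post is queued but NOT shown to other members yet.
        self.pending.append(post)

    def review_next(self, approve: bool) -> Post:
        # A moderator takes the oldest pending item and approves or rejects it.
        post = self.pending.popleft()
        if approve:
            post.approved = True
            self.published.append(post)
        return post
```

The key property is that `published` stays empty until a human acts, which is exactly where the review delay and staffing cost come from.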
2. Post-moderation

When active moderation is still required but speed matters, post-moderation can be a useful alternative to the pre-moderation technique. Community contributors see their content displayed immediately after they submit it. Moderators or managers then review comments after users post them to ensure the content meets the rules and regulations set forth by site administration.
This type of moderation system lets community members post items in real time. A model designed around post-moderation keeps communities fast-paced and keeps interest in the community alive. Because most users are accustomed to social networks, instant gratification is often what keeps them coming back. When it comes to interacting on the web, post-moderation keeps conversations going while keeping security at the forefront.
With all this successful moderation may come growth, and the cost of maintaining this type of system can become difficult to manage. Furthermore, every item of content that a moderator views and approves can become the legal responsibility of the website owner or operator. Because of these possible legal ramifications, post-moderation may be best used for only a limited time.
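The contrast with pre-moderation is that content goes live first and gets checked second. A minimal sketch, with hypothetical names (`PostModerationFeed`, `review`), might look like this:

```python
class PostModerationFeed:
    """Content appears instantly; a moderator reviews it afterwards
    and removes anything that breaks the rules."""

    def __init__(self):
        self.visible = []      # live, user-facing content
        self.review_log = []   # items a moderator has already checked

    def submit(self, text: str) -> None:
        # Published immediately -- no waiting for approval.
        self.visible.append(text)

    def review(self, text: str, keep: bool) -> None:
        # The moderator checks the item after the fact; rejected
        # content is pulled from the live feed.
        self.review_log.append(text)
        if not keep:
            self.visible.remove(text)
```

Note that between `submit` and `review` the content is publicly visible, which is the source of the legal exposure discussed above.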
3. Reactive moderation
In a reactive moderation system, community members themselves flag comments that may not fall in line with the rules and regulations of the site. Reactive moderation is also a tool administrators use alongside pre- and post-moderation systems, and it serves as a kind of 'safety net' for instances where members see violations before moderators do.
Reactive moderation can even serve as a site's sole moderation method. It gives members the option, and sometimes the duty, to report or flag content they deem unsuitable or in violation of house rules. In this process, a report icon or button is located near each comment so that users can easily flag words or entire posts that do not follow the site's forum rules.
When a user flags content for evaluation, moderators or managers can review it and approve or delete it. Where flagging or reporting is enabled on forum sites, users assist moderators at no additional cost.
However, relying on a reactive moderation system can still prove damaging to some brands and companies, because flagged content may remain viewable until it is handled. Depending on how quickly moderators react to reports, content left unchecked can be detrimental to the company's reputation.
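The report-button flow above can be sketched as a simple flag counter. Everything here is illustrative: the class name, the `report` method, and the threshold of three distinct reporters are all assumptions, not a real product's behavior:

```python
FLAG_THRESHOLD = 3  # hypothetical: auto-hide after three distinct reports


class ReactiveModeration:
    """Members flag posts; enough flags hides a post pending moderator review."""

    def __init__(self):
        self.flags = {}      # post id -> set of users who reported it
        self.hidden = set()  # posts pulled from view, awaiting a decision

    def report(self, post_id: str, user: str) -> None:
        # Track distinct reporters so one user can't mass-flag a post.
        self.flags.setdefault(post_id, set()).add(user)
        # Once enough different users have flagged it, hide the post;
        # a moderator then makes the final approve/delete call.
        if len(self.flags[post_id]) >= FLAG_THRESHOLD:
            self.hidden.add(post_id)
```

An auto-hide threshold like this shortens the window in which harmful content stays visible, which is the weakness noted above.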
4. Distributed moderation
Although rare, distributed moderation is a method that distributes the moderation work to the users themselves. It is a system that relies on a rating mechanism: members of the forum community vote comments up or down, collectively determining whether other users' submissions adhere to community guidelines and other expectations.
Distributed moderation gives the most control to its users and community members. Giving members more freedom can save time; however, counting on members to self-moderate is tricky. When members have more control, the site has less control over the content.
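The up/down voting mechanism can be sketched as a running score per post, with content hidden once its score drops below a threshold. The names and the threshold value are hypothetical illustrations:

```python
HIDE_SCORE = -3  # hypothetical threshold at which content disappears from view


class DistributedModeration:
    """The community's votes, not a dedicated moderator, decide visibility."""

    def __init__(self):
        self.scores = {}  # post id -> net vote score (ups minus downs)

    def vote(self, post_id: str, up: bool) -> None:
        # Each member vote nudges the post's score up or down by one.
        self.scores[post_id] = self.scores.get(post_id, 0) + (1 if up else -1)

    def is_visible(self, post_id: str) -> bool:
        # The post stays visible until the community votes it below the line.
        return self.scores.get(post_id, 0) > HIDE_SCORE
```

Note that no moderator appears anywhere in this flow, which is precisely why the approach is both cheap and risky.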
For legal and branding reasons, expecting the community to self-moderate is a direction companies very rarely take. Examples of brands using this type of system include Slashdot, with its community voting system, and SocialMod, which leverages Amazon's Mechanical Turk to process content with a large pool of workers.
5. Automated moderation
Automated moderation reduces the cost and delays of manual moderation systems and is a valuable weapon in the moderator's arsenal. It consists of deploying various technical tools that process user-generated content, or UGC, applying a set of rules to approve or reject content submissions.
A commonly used tool in the UGC arsenal is the word filter. A word filter does just what its name suggests: it checks words submitted by users against the established rules and bans those that violate them. Word filters can offer alternative wording suggestions or replace unwanted words and phrases with predefined alternatives.
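A minimal word filter of the replace-with-alternative kind can be written with a regular expression. The banned-word list here is a made-up placeholder; a real deployment would load the list and replacements from the site's moderation rules:

```python
import re

# Hypothetical banned words mapped to their predefined replacements.
BANNED = {"spamword": "[removed]", "badword": "[removed]"}

# Match any banned word as a whole word, case-insensitively.
_PATTERN = re.compile(
    r"\b(" + "|".join(map(re.escape, BANNED)) + r")\b",
    re.IGNORECASE,
)


def filter_words(text: str) -> str:
    """Replace banned words in the submission with their alternatives."""
    return _PATTERN.sub(lambda m: BANNED[m.group(0).lower()], text)
```

Using `\b` word boundaries avoids the classic filter pitfall of mangling innocent words that merely contain a banned substring.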
Another useful tool offered by automated moderation systems is the IP block. This tool can reject submissions outright and even bar specific IP addresses from submitting content in the future. Brands such as Crisp Thinking use this tool successfully within their forums.
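An IP block boils down to checking the submitter's address against a block list before accepting content. This sketch uses Python's standard `ipaddress` module; the blocked ranges are reserved documentation addresses used purely as placeholders:

```python
import ipaddress

# Hypothetical block list. Real systems persist this and may ban whole
# networks (CIDR ranges), not just single addresses.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # an entire banned range
    ipaddress.ip_network("198.51.100.7/32"),  # a single banned address
]


def is_blocked(ip: str) -> bool:
    """Return True if the submitter's IP falls in any blocked network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)
```

Supporting CIDR ranges rather than exact addresses matters in practice, since abusers often rotate through addresses within one network.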
In closing, moderation is key to the trust and success of a forum or brand. Having no moderation could leave your brand open to negative interpretations and leave members vulnerable to spam and other harmful content. Websites and forums are well maintained when moderation systems are in place; the right system lessens the moderators' workload while still offering freedom to community members.