Moderation appeals, and the processes around them, deserve careful thought.
What seems like a clear policy to an internal team can be very confusing to your platform’s users, and transparency and communication go a long way toward de-escalating potentially volatile situations.
This article takes you through the benefits of incorporating an appeals system, whether that is a simple email channel or a complex workflow, and details industry best practices.
So, how can a robust appeals process help your business?
Here’s how to make sure you stay compliant with regulations, as explained by Toshali Sengupta, policy analyst at Tremau:
What was once an optional practice is rapidly becoming an obligation across jurisdictions, under regulations aiming to increase user empowerment. The Digital Services Act (DSA), which is already being enforced, and the UK’s Online Safety Act (UK OSA) both introduce mandatory internal complaint-handling mechanisms for online platforms.
Specifically, the DSA details three levels of complaint handling: an internal complaint system (commonly known as an appeals process), an out-of-court dispute settlement body, and judicial redress.
Legal requirements aside, there is real value in adding a channel for internal complaint handling to your platform.
An obvious byproduct of taking moderation action against someone is that they may get upset. This is even more likely when the platform itself has emotional or financial significance in the person’s life.
When people cannot get an answer out of official channels, they will escalate any way that they can. This can include blasting the company on social media, personal emails or LinkedIn messages to employees and executives, threatening to take legal action, and going to the press. These types of escalations are complicated to deal with and take a lot of time—ultimately costing your business real money.
Transparency and communication can go a long way towards de-escalating a potentially volatile situation. When platforms have appeals systems with clear instructions, thorough communication, and fair and transparent processes, they’re far less likely to see escalations through other channels.
Even if a platform does its very best to ensure accurate decision-making, no system is infallible. Machine learning systems operate within an acceptable false-positive rate, and humans make mistakes no matter how hard they try not to. Sometimes moderation systems can have unintended consequences, amplifying bias against marginalized communities.
Robust appeals systems act as a check and balance on these systems, giving platforms the data they need to make changes, whether that is providing additional training to moderation teams, updating machine learning models, or educating users about what is appropriate to report.
What can seem like a clear policy to an internal team can sometimes be incredibly confusing to the platform’s users. If a disproportionate number of users are appealing a specific policy, it can give teams the opportunity to examine how they are educating users on their guidelines, and whether further information is needed prior to the enforcement stage.
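If you log which policy each enforcement and each appeal falls under, even a simple report can surface the guidelines that users may find confusing. Here is a minimal sketch of that idea in Python; the policy names and the 25% threshold are illustrative assumptions, not figures from any regulation or from this article:

```python
from collections import Counter

def flag_confusing_policies(enforcements, appeals, threshold=0.25):
    """Flag policies whose appeal rate is disproportionately high.

    enforcements: list of policy names, one per moderation action taken.
    appeals: list of policy names, one per appeal filed.
    threshold: appeal-to-enforcement ratio above which a policy is flagged.
    """
    enforced = Counter(enforcements)
    appealed = Counter(appeals)
    flagged = {}
    for policy, actions in enforced.items():
        rate = appealed.get(policy, 0) / actions
        if rate > threshold:
            flagged[policy] = round(rate, 2)
    return flagged

# Example: "impersonation" draws appeals on 40% of its enforcements,
# a signal that the user-facing guideline may need clearer language.
actions = ["spam"] * 100 + ["impersonation"] * 20
appeals = ["spam"] * 5 + ["impersonation"] * 8
print(flag_confusing_policies(actions, appeals))  # {'impersonation': 0.4}
```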
Start with great policies and community guidelines that are easy to find, in plain language, and communicate not only what you expect of your users, but why. These policies should also be enforceable—meaning your automated systems and moderation teams are actually able to find and remove policy-violating content and behavior on your platform.
An appeals hub can be as simple as a help page with an email address, or as complicated as an automated system. What’s necessary is a dedicated area with clear information about how you make moderation decisions, what types of enforcement you take, and how users can appeal a decision. If you have a time limit for how long users can appeal after a decision is made, communicate that clearly as well.
When creating the appeals flow, consider adding some friction (while remembering that the process cannot include any formal requirements). Without any friction at all, a bare email address on a website can quickly get flooded with spammy or bad-faith appeals.
You may want to include a confirmation page outlining your policies or a text box to add context to the appeal, which may slow down spammers. Additionally, consider having a mechanism to only allow one appeal per decision rather than unlimited open-ended appeals.
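As an illustration of the one-appeal-per-decision idea, here is a minimal intake check in Python. The class and method names are hypothetical; in practice this logic would live in your ticketing or appeals backend:

```python
class AppealIntake:
    """Accept at most one appeal per moderation decision, and require
    the user to add context, which slows down bad-faith submissions."""

    def __init__(self):
        self._appealed_decisions = set()

    def submit(self, decision_id: str, user_context: str) -> str:
        # A little friction: an empty appeal is rejected outright.
        if not user_context.strip():
            return "rejected: please explain why the decision was wrong"
        # One appeal per decision, not unlimited open-ended appeals.
        if decision_id in self._appealed_decisions:
            return "rejected: this decision has already been appealed"
        self._appealed_decisions.add(decision_id)
        # Hand off to the human review queue here (omitted).
        return "accepted"

intake = AppealIntake()
print(intake.submit("decision-123", "The post was satire, not harassment."))
print(intake.submit("decision-123", "Trying again."))  # rejected as a duplicate
```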
People who are appealing moderation decisions want to be sure that their complaint is being taken seriously, and often this means they’d like a human to review the evidence. The DSA also requires human review when the original decision was made by automated means.
This means that communications should clearly indicate when a human is involved, and should use language that is personalized to the complaint. Include links to policies and guidelines, as well as any evidence (if possible).
Ideally, teams reviewing appeals should be separate from the teams making the initial decisions. This allows for more objective decision-making and constructive feedback. At a minimum, you should ensure that no moderator ever reviews their own decisions.
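That minimum safeguard is straightforward to enforce in code: record who made the original decision and exclude them when assigning the appeal. A small sketch, with illustrative names:

```python
import random

def assign_reviewer(appeal_id, reviewers, original_moderator):
    """Pick a reviewer for an appeal, never the original decision-maker.

    Ideally `reviewers` is a separate team entirely; at minimum,
    filter out the moderator whose decision is being appealed.
    """
    eligible = [r for r in reviewers if r != original_moderator]
    if not eligible:
        raise RuntimeError(f"no eligible reviewer for appeal {appeal_id}")
    return random.choice(eligible)

print(assign_reviewer("appeal-42", ["alice", "bob", "carol"],
                      original_moderator="bob"))  # never "bob"
```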
The DSA requires responses to appeals to be “swift”, which is also best practice for preventing escalations through other means. However, if you respond too quickly (for example, a few minutes after an appeal is filed), people may think you haven’t taken enough time to investigate properly, and may complain or escalate further.
Pay attention to feedback from your appeals process to find the sweet spot that shows you are responding swiftly, but also taking the time to make a thorough decision.
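If your tooling allows it, you can encode that sweet spot directly: hold decisions for a minimum period so responses never look instant, and track an upper bound so they stay swift. The windows below are made-up examples to tune from your own feedback, not mandated figures:

```python
from datetime import datetime, timedelta

# Illustrative windows, not regulatory numbers: adjust them based on
# feedback from your own appeals process.
MIN_HOLD = timedelta(hours=4)      # avoid responses that look careless
SLA_DEADLINE = timedelta(days=7)   # upper bound to keep responses swift

def response_window(filed_at: datetime) -> tuple[datetime, datetime]:
    """Earliest and latest times to send the appeal decision."""
    return filed_at + MIN_HOLD, filed_at + SLA_DEADLINE

filed = datetime(2024, 3, 1, 9, 0)
earliest, latest = response_window(filed)
print(f"Respond between {earliest} and {latest}")
```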
When communicating final moderation decisions, there is a very real risk that people who don’t like the answer will push back. If this bad news is linked to a specific employee (for example, through a customer support inbox using an agent’s real name or photo) then this can put the agent at risk. A first name and company name can be searched on LinkedIn to find location, photos, and more.
Best practice is to handle all communication about moderation decisions anonymously, either through a generic “company” account or by using aliases and generic avatars for the support agents or moderators handling the appeal.
Most people understand why moderation is important, and they also know that sometimes people make mistakes. Apologizing when you make a moderation error can go a long way. If you’re able to offer a free subscription as a token of thanks for understanding, even better. Sometimes users can come out of the situation with even more respect for the moderation process.
Once you’ve made the final decision, make sure there can’t be any more back-and-forth communication. If your appeals system runs over email, this could be a note in the final decision message explaining that it is the last communication and that any replies will be closed automatically. Set clear expectations about what the appeals process does and does not cover.
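In an email-based system, this can be as simple as marking the thread closed when the final decision goes out and auto-replying to anything that arrives afterward. A sketch of that routing logic, with hypothetical names:

```python
CLOSED_THREADS = set()

AUTO_CLOSE_MESSAGE = (
    "This appeal has received a final decision and is now closed. "
    "Replies to this thread are not monitored."
)

def send_final_decision(thread_id: str, decision_text: str) -> str:
    """Send the last message on an appeal and close the thread."""
    CLOSED_THREADS.add(thread_id)
    return decision_text + "\n\n" + AUTO_CLOSE_MESSAGE

def handle_inbound_reply(thread_id: str) -> str:
    """Route an inbound email: closed threads get the auto-close notice."""
    if thread_id in CLOSED_THREADS:
        return AUTO_CLOSE_MESSAGE  # no human ever sees the reply
    return "routed to appeals queue"

print(send_final_decision("thread-7", "We reviewed your appeal and upheld the decision."))
print(handle_inbound_reply("thread-7"))  # auto-closed
```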
Do you need help responding to appeals? PartnerHero can help! Reach out to a solutions specialist to find out more.