To truly foster a sense of safety and belonging for everyone, Trust & Safety teams must prioritize inclusivity from the start. ActiveFence has identified eleven ways for teams to ensure that diverse groups can authentically and freely interact online, building robust communities where all have a chance to thrive.
Creating inclusive online spaces is at the heart of Trust & Safety. As teams know best, that means ensuring a sense of safety and belonging for everyone. In an environment where online communities are becoming increasingly divisive and often hostile towards minorities, diverse perspectives and representation are more important than ever. Ultimately, creating a safe space for all will attract more users and allow your business to thrive.
Though the task may be challenging, it’s on Trust & Safety teams to set new standards for online access for all. Inspired by Pride Month, ActiveFence has chosen eleven ways for Trust & Safety teams to create inclusive spaces on their platforms for diverse minority communities.
It’s important to note that inclusivity may look different from one platform to another based on modes of communication. Text-heavy platforms may require more diversified content, while platforms built around images and videos may require diversified imagery. The nature of the platform generates different requirements as well: a one-to-one messaging platform may need fewer measures, while a dating application requires different options and protections. Gaming platforms have the additional need to ensure that imagery representing users, such as avatars, is diverse.
Despite platform-specific dependencies, all platforms must prioritize inclusivity from the start.
Beginning with the company itself to the ongoing maintenance of fostering a healthy community, Trust & Safety teams can build inclusivity into a platform. Here are eleven ways to accomplish this.
Inclusivity starts in the workplace. Fair employee policies and a culture that celebrates differences are just a few ways to ensure a diverse workplace. This will attract a wide range of employees, ultimately reflecting in the product itself. From a better understanding of customers to new ideas, inclusive workplaces create inclusive products.
According to Microsoft, Inclusive Design is a “methodology, born out of digital environments, that enables and draws on the full range of human diversity.” In practice, this idea moves away from a one-size-fits-all approach toward the principles of Safety by Design, providing the best individual user experience for as many people as possible. By addressing the rarest or most extreme needs, from differences in ability, age, gender, or language, platforms enable people from all walks of life to engage.
Without platform safety for everyone, inclusivity cannot happen. A 2021 study by the Anti-Defamation League found that 64% of LGBTQ+ respondents reported experiencing online harassment, compared to 41% of all other demographics. Trust & Safety teams can help.
To start, a clear policy against discrimination, hate speech, and bullying is necessary for effective moderation. Content detection tools, AI, intelligence, and moderators must be up to par to protect users, and policy must be swiftly enforced against non-compliant users.
Read about enforcement best practices in our blog, Policy Enforcement: A Nuanced Approach.
Furthermore, teams must ensure that content moderation mechanisms don’t discriminate against the populations they are trying to protect. For example, slang or derogatory names might be used by a user in a hateful way against another user, while in a different context, a friend may use the same word in a friendly manner towards another user.
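This context problem is why bare keyword blocklists tend to over-flag the very communities they are meant to protect. The sketch below is purely illustrative (the function names, the placeholder terms, and the idea of an external "hostility score" from a classifier are all assumptions, not an ActiveFence API): it contrasts a naive word-match check with a decision that also weighs context such as the relationship between users.

```python
# Illustrative sketch only: names and thresholds are hypothetical.
# A bare blocklist flags a term regardless of context; a contextual
# decision combines the term with other signals before flagging.

BLOCKLIST = {"slur_a", "slur_b"}  # placeholder terms, not real slurs

def blocklist_flag(message: str) -> bool:
    """Naive approach: flag if any blocked term appears at all."""
    words = (w.strip(",.!?") for w in message.lower().split())
    return any(w in BLOCKLIST for w in words)

def context_aware_flag(message: str,
                       sender_and_target_are_friends: bool,
                       hostility_score: float) -> bool:
    """Sketch of a contextual decision. `hostility_score` stands in
    for the output of a (hypothetical) message-level classifier."""
    contains_term = blocklist_flag(message)
    if not contains_term:
        # Hateful content can exist without any listed term.
        return hostility_score > 0.9
    # A listed term between friends, in a low-hostility message,
    # is likely reclaimed or friendly usage.
    if sender_and_target_are_friends and hostility_score < 0.3:
        return False
    return hostility_score > 0.5

msg = "hey slur_a, ready for tonight?"
print(blocklist_flag(msg))                   # flags in every context
print(context_aware_flag(msg, True, 0.1))    # friendly banter: not flagged
print(context_aware_flag(msg, False, 0.8))   # hostile context: flagged
```

The point is not the specific thresholds but the shape of the decision: the same term yields different outcomes depending on context, which a word-level blocklist cannot express.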
Only about 16% of the world’s population speaks English, yet roughly 60% of all web content is in English. By becoming accessible in more than one language, platforms can reach far more users who will contribute to a healthier, more inclusive online world. Additionally, as a general rule, platforms should use simple language that is easy for all readers to understand; jargon, slang, and culturally exclusive language should be avoided.
Gender-neutral terms should be used throughout platforms and any content promoted on a platform. Additionally, allowing users to choose how they would like to represent themselves, such as asking for preferred pronouns, makes a platform more user-friendly. However, it should be optional for users to do this. Required fields to create an account should also exclude sensitive data such as race and gender. In the end, it’s about facilitating choice of self-expression.
Visual representation is key to inclusivity. Images throughout platforms, emojis, and a selection of character representations are just a few areas that should include diverse imagery. Stock photo collections like the Gender Spectrum Collection, Show Us, and the No Apologies Collection make it easy for platforms to find a variety of images. Additionally, where appropriate, such as in gaming, users should be able to select and mix a variety of genders, skin tones, and other external signifiers.
Celebrating awareness days and months, culturally diverse art and history, and national holidays are easy ways to welcome a range of communities and share cultures with users of all backgrounds. Furthermore, platform content should be mindful of different cultures and populations when targeting users. Content, such as ads, may be relevant for one user while entirely irrelevant to another.
Additionally, inclusive platforms should be easy to find. SEO best practices that target different communities will make it easier for audiences to find platforms. External links, keywords throughout content, and meta-tags will boost your platform in search results.
Social biases color not only our society but artificial intelligence as well. Datasets often contain generalizations that exclude different skin colors, languages, vocabulary, cultures, or genders, creating discrimination on platforms. For example, machine vision technology may work reliably only for certain subsets of users based on race, affecting video detection, and platform searches may suggest biased results that exclude underrepresented populations.
However, inclusive AI is advancing. Algorithms can be debiased by changing initial training datasets and shaping a more equal and inclusive future. Trust & Safety teams must ensure that this is a priority among developers.
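One of the simplest versions of "changing initial training datasets" is rebalancing, so that no demographic group is drowned out before the model ever trains. The sketch below is a minimal illustration of that one step, under the assumption that each training example carries a group label; it is not ActiveFence's method, and real debiasing work involves much more than resampling.

```python
# Minimal sketch of dataset rebalancing by oversampling.
# Assumes each example is a dict with a group label; purely illustrative.
import random

def rebalance(examples, group_key):
    """Oversample smaller groups until every group matches the largest."""
    by_group = {}
    for ex in examples:
        by_group.setdefault(ex[group_key], []).append(ex)
    target = max(len(group) for group in by_group.values())
    balanced = []
    for group in by_group.values():
        balanced.extend(group)
        # Duplicate random examples to bring this group up to the target.
        balanced.extend(random.choices(group, k=target - len(group)))
    return balanced

# A 90/10 split becomes 90/90 after rebalancing.
data = ([{"group": "A", "text": "..."}] * 90 +
        [{"group": "B", "text": "..."}] * 10)
balanced = rebalance(data, "group")
counts = {}
for ex in balanced:
    counts[ex["group"]] = counts.get(ex["group"], 0) + 1
print(counts)  # {'A': 90, 'B': 90}
```

Oversampling is the bluntest instrument available; collecting genuinely representative data for underrepresented groups is better, but the principle, equal representation before training, is the same.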
Trust & Safety teams should be measuring success of their moderation efforts to improve all moderation activities. In this case, teams should pay attention to instances of abuses caused by discrimination such as hate speech or bullying. Furthermore, when platforms test product changes, incorporating measurements for discrimination should be a priority. Platforms should also welcome feedback from users to help them understand their experiences.
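A concrete way to "incorporate measurements for discrimination" is to break a standard moderation metric down by user group. The sketch below computes a false-positive rate (benign content wrongly flagged) per group; the data, group names, and decision format are all hypothetical, and it assumes a labeled sample of moderation decisions is available for review.

```python
# Hypothetical sketch: per-group false-positive rate for moderation.
# A large gap between groups suggests discriminatory enforcement.

def false_positive_rate(decisions):
    """decisions: list of (flagged: bool, actually_abusive: bool) pairs."""
    benign = [d for d in decisions if not d[1]]
    if not benign:
        return 0.0
    return sum(1 for flagged, _ in benign if flagged) / len(benign)

# Illustrative labeled samples for two (hypothetical) user groups.
by_group = {
    "group_x": [(True, False), (False, False), (False, False), (True, True)],
    "group_y": [(True, False), (True, False), (False, False), (True, True)],
}
rates = {g: false_positive_rate(d) for g, d in by_group.items()}
print(rates)  # group_y's benign posts are flagged twice as often
```

The same breakdown works for any moderation metric (false negatives, appeal overturn rates, time-to-action); what matters is comparing groups side by side rather than reporting a single platform-wide number.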
Platforms have the power to influence not only the online world but greater society. This June, promote LGBTQ+ organizations with fundraising or awareness campaigns. Whatever you decide to contribute, show your audience how you’re helping.
Teams must stay educated. Platforms will only remain inclusive if teams understand their users. This process involves following advocacy organizations, staying up to date on evolving terminology, supporting policies that promote diversity in the workplace, continuously asking for employee and user feedback, and educating your team.
Creating an inclusive platform comes down to awareness, safety, and empowerment. Without understanding different populations and their needs, platforms cannot make a change. Using this knowledge, it’s the responsibility of Trust & Safety teams to create safety so that an environment of inclusivity can develop. Last comes empowerment: user choice, self-representation, and access to diverse content will empower users to interact on the platform authentically and freely, building the dynamic social environments where all thrive.
Want to build safer, more inclusive platforms? Download our Trust & Safety Buyer’s Guide to learn how.