Content Moderation Best Practices

The following are industry best practices recommended for any Reflected customers that allow users to upload adult content (UGC Customers):

1. Verified Uploaders Only

Only users who have verified their age and identity should be permitted to upload adult content on customers’ sites. UGC Customers must ensure that all uploaders are over the age of 18 and must use industry-standard age and identity verification services.
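As a minimal sketch of how this gate could be enforced at upload time (the User fields are hypothetical stand-ins for results returned by a third-party verification service):

```python
from dataclasses import dataclass

@dataclass
class User:
    # Hypothetical fields, assumed to be set only after an
    # industry-standard third-party verification check succeeds.
    age_verified: bool
    identity_verified: bool

def may_upload_adult_content(user: User) -> bool:
    """Permit adult-content uploads only from users who have
    passed both age and identity verification."""
    return user.age_verified and user.identity_verified
```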

2. Banning Downloads

Downloading of amateur adult content should be prohibited, and the download feature should be disabled on the customer's platform. Downloads of adult content should be permitted only from verified content partner studio accounts.
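Expressed as a simple permission check (the account-type value is illustrative, not a prescribed schema):

```python
def may_download(owner_account_type: str) -> bool:
    """Enable the download feature only for content owned by a
    verified partner studio; amateur content is never downloadable."""
    return owner_account_type == "verified_partner_studio"
```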

3. Content Moderation Team

UGC Customers should have a dedicated human content moderation team trained to identify and take down illegal and abusive material. The team should be proactive rather than merely responsive to complaints. Its size and composition should be commensurate with the size of the site and the customer's resources.

4. Banned Words List

UGC Customers should develop and implement a list of banned or flagged words (“Banned Words List”) that trigger deletion or review by the content moderation team. The Banned Words List must be consistent with industry standards for identifying illegal or abusive content and must be updated regularly.
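A minimal sketch of such a filter, assuming two illustrative word sets (one triggering deletion, one triggering review); a production list would follow industry standards and be maintained continuously:

```python
import re

# Illustrative placeholders; real lists must follow industry
# standards for identifying illegal or abusive content.
DELETE_WORDS = {"term_triggering_deletion"}
REVIEW_WORDS = {"term_triggering_review"}

def screen_text(text: str) -> str:
    """Return 'delete', 'review', or 'ok' for user-supplied text
    such as titles, tags, and descriptions."""
    tokens = set(re.findall(r"[a-z0-9_]+", text.lower()))
    if tokens & DELETE_WORDS:
        return "delete"
    if tokens & REVIEW_WORDS:
        return "review"
    return "ok"
```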

5. Search Term Monitoring

UGC Customers should engage in ongoing moderation of their users' search terms to identify phrases designed to bypass existing safeguards.
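One hedged approach: normalize logged search terms to undo common evasions (character substitutions, inserted punctuation) before matching against the Banned Words List, and queue anything that matches only after normalization for moderator review:

```python
import re

BANNED_WORDS = {"bannedterm"}  # illustrative placeholder

# Undo common character substitutions used to dodge word filters.
LEET_MAP = str.maketrans("013457@$", "oleastas")

def flag_search_term(term: str) -> bool:
    """Flag a search term whose normalized form contains a banned
    word, catching spellings designed to bypass the literal list."""
    normalized = re.sub(r"[^a-z]", "", term.lower().translate(LEET_MAP))
    return any(word in normalized for word in BANNED_WORDS)
```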

6. Parental Controls

UGC Customers should label their sites in such a way that will allow the sites to be identified and blocked by standard parental control filters.
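The most widely recognized labeling scheme is the RTA ("Restricted To Adults") label, a fixed string that parental control filters look for in page metadata or response headers. A sketch of serving it follows; consult the RTA documentation for exact placement, as the handler shown is illustrative:

```python
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"  # the standard RTA label string

# Meta tag form, placed in the <head> of every page.
RTA_META_TAG = f'<meta name="RATING" content="{RTA_LABEL}" />'

def add_rta_header(headers: dict) -> dict:
    """Attach the RTA label as an HTTP response header so filters
    can block the site without parsing the page body."""
    headers["Rating"] = RTA_LABEL
    return headers
```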

7. Automated Filters

UGC Customers should use reasonable efforts to integrate automated filters designed to detect illegal and abusive content, such as the following (a generic fingerprint-matching sketch appears after this list):

  • CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online
  • Content Safety API, Google's artificial intelligence tool that helps detect illegal imagery
  • PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation
  • Vobile, fingerprinting software that scans new uploads for potential matches to unauthorized materials, protecting against banned videos being re-uploaded to the platform
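These services differ in detail, but most follow the same pattern: fingerprint each new upload and compare it against a curated list of known illegal or banned material. The sketch below uses an exact cryptographic hash purely for illustration; real tools such as PhotoDNA use robust perceptual hashing or video fingerprinting that survives re-encoding, and are accessed through their own vetted APIs:

```python
import hashlib

# Populated from a vetted hash-list provider; empty placeholder here.
KNOWN_BAD_HASHES: set[str] = set()

def screen_upload(data: bytes) -> bool:
    """Return True if the upload matches a known-bad fingerprint,
    in which case it must be blocked and escalated for review."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES
```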

8. Abuse Reporting Tool

UGC Customers should implement a reporting form or feature that allows users and third parties to report illegal or abusive content to the site for expedited review and, if appropriate, removal.
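A minimal shape for such a report, with illustrative field names rather than a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AbuseReport:
    content_url: str
    reason: str             # e.g. "underage", "non-consensual", "other"
    details: str
    reporter_contact: str   # optional, for follow-up questions
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    expedited: bool = True  # abuse reports bypass the normal review queue
```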

9. Trusted Flagger Program

UGC Customers should implement a program designed to allow trusted third parties to identify and immediately disable access to illegal content pending review by the customer. Examples of trusted third parties include:

  • Cyber Civil Rights Initiative (United States of America)
  • National Center for Missing & Exploited Children (United States of America)
  • Internet Watch Foundation (United Kingdom)
  • Stopline (Austria)
  • Child Focus (Belgium)
  • Safenet (Bulgaria)
  • Te Protejo Hotline - I Protect You Hotline (Colombia)
  • CZ.NIC - Stop Online (Czech Republic)
  • Point de Contact (France)
  • Eco-Association of the Internet Industry (Germany)
  • Safeline (Greece)
  • Save the Children (Iceland)
  • Latvian Internet Association (Latvia)
  • Meldpunt Kinderporno - Child Pornography Reporting Point (Netherlands)
  • Centre for Safer Internet Slovenia (Slovenia)
  • FPB Hotline - Film and Publication Board (South Africa)
  • ECPAT (Sweden)
  • ECPAT (Taiwan)
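A hedged sketch of the takedown logic, with the flagger identifiers and in-memory state standing in for real partner credentials and a real review system:

```python
# Illustrative identifiers for vetted partner organizations.
TRUSTED_FLAGGERS = {"ncmec", "iwf"}

REVIEW_QUEUE: list[tuple[str, str]] = []
DISABLED_CONTENT: set[str] = set()

def handle_flag(content_id: str, flagger_id: str) -> None:
    """Flags from trusted partners disable access immediately,
    pending human review; all flags join the review queue."""
    if flagger_id in TRUSTED_FLAGGERS:
        DISABLED_CONTENT.add(content_id)  # immediate takedown
    REVIEW_QUEUE.append((content_id, flagger_id))
```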

10. Underage Content Reporting

UGC Customers should comply with all applicable legal obligations to report apparent underage content to the appropriate agencies, such as the National Center for Missing & Exploited Children (NCMEC).

11. Transparency Report

UGC Customers should publish, at least annually, a transparency report describing their content moderation and removal efforts.
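As a trivial sketch, the report can be driven by a running tally of logged moderation actions (the action labels are illustrative):

```python
from collections import Counter

def transparency_summary(actions: list[str]) -> dict[str, int]:
    """Tally logged moderation actions, e.g. 'removed', 'reviewed',
    'reported_to_agency', for the annual transparency report."""
    return dict(Counter(actions))
```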

