A service that automatically analyses an uploaded image and performs actions based on the results. Results are broken down into categories, such as partial nudity, nudity, and explicit material, which are automatically moderated.
Each category detected within an image also includes a percentage value indicating the reliability of the result. The accuracy setting lets you control how confident the system must be before it performs an action on detected content.
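As a rough illustration of how the accuracy setting gates actions, the sketch below checks each detected category's confidence against a threshold. The category names, the 80% value, and the function are hypothetical examples, not the service's API.

```python
# Illustrative sketch: act only on detections at or above the accuracy setting.
ACCURACY_THRESHOLD = 0.80  # hypothetical accuracy slider value

def should_act(detections: dict[str, float]) -> list[str]:
    """Return the categories confident enough to trigger a moderation action."""
    return [category for category, confidence in detections.items()
            if confidence >= ACCURACY_THRESHOLD]

# Example analysis result: category -> confidence reported for the image
result = {"partial_nudity": 0.92, "explicit": 0.41}
print(should_act(result))  # only 'partial_nudity' clears the 80% threshold
```

With the threshold at 80%, the low-confidence "explicit" detection is ignored rather than triggering a false positive.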
Currently, Image Moderation can perform the following actions:
Whitelist - Automatic approval. Used when images containing certain content should be approved, e.g. allowing images of the human body to be submitted to a biology discussion board.
Blacklist - Moderation failed. An image containing a blacklisted category will be blocked automatically, e.g. preventing graphic content from being submitted to your public website.
Greylist - Manual approval. Rather than being approved or rejected automatically, greylisted images are stored for review by the user, who chooses what action to take. E.g. beach photos: swimming trunks, yes; budgie smugglers, possibly.
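The three actions amount to a mapping from detected category to outcome. The sketch below models that mapping; the category names, the fallback to manual review, and the `moderate` function are assumptions for illustration, not the service's real configuration format.

```python
from enum import Enum

class Action(Enum):
    WHITELIST = "approve"        # automatic approval
    BLACKLIST = "reject"         # moderation failed, blocked automatically
    GREYLIST = "manual_review"   # stored for review by the user

# Hypothetical per-category configuration
CATEGORY_ACTIONS = {
    "medical_nudity": Action.WHITELIST,   # e.g. biology discussion board
    "explicit": Action.BLACKLIST,         # e.g. public website
    "swimwear": Action.GREYLIST,          # e.g. beach photos
}

def moderate(category: str) -> str:
    # Assumption: unlisted categories fall back to manual review as a safe default
    return CATEGORY_ACTIONS.get(category, Action.GREYLIST).value

print(moderate("explicit"))  # reject
```

Defaulting unknown categories to the greylist is a conservative design choice: nothing unexpected is published without a human looking at it first.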
Defining Category Actions
New accounts come with preconfigured image settings. This configuration provides reliable all-round protection against nudity and explicit material.
To customise the settings, clear the existing definition by clicking the 'Clear All' button at the bottom of the screen. Alternatively, click the category markers within the three action lists, or choose a new category from the green category panel at the top of the page, then:
- Click a category
- Select the action you would like to take for this type of content
- Adjust the accuracy slider to fine-tune your moderator (we recommend 75%-80% as a starting point)
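The steps above produce, in effect, one rule per category: an action plus an accuracy value. The sketch below shows how such rules might be evaluated against a detection; the field names, percentages, and `apply` function are illustrative assumptions, not the service's stored format.

```python
# Hypothetical settings built by the steps above: each chosen category
# carries an action and the accuracy slider value you set for it.
settings = [
    {"category": "explicit", "action": "blacklist", "accuracy": 80},
    {"category": "partial_nudity", "action": "greylist", "accuracy": 75},
]

def apply(category: str, confidence: int, settings: list[dict]) -> str:
    """Return the configured action if the detection clears its accuracy bar."""
    for rule in settings:
        if rule["category"] == category and confidence >= rule["accuracy"]:
            return rule["action"]
    return "approve"  # assumption: no matching rule means the image passes

print(apply("explicit", 85, settings))  # blacklist
print(apply("explicit", 60, settings))  # approve (below the 80% slider)
```

Raising a category's accuracy value makes the moderator act less often on that category; lowering it makes the moderator stricter but risks more false positives.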
Remember to save your settings before leaving the page.