Content Moderator documentation

The Azure Content Moderator API checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content to comply with regulations or to maintain the environment you intend for your users.
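As a rough illustration of the flow above, the sketch below builds a text-screening request against the Content Moderator REST endpoint and inspects the flags in the response. The region (`westus`), the subscription key, and the exact response fields are placeholders/assumptions here; consult the Reference section for the authoritative contract.

```python
# Sketch of screening text with the Content Moderator REST API.
# The region, key, and response-field names are assumptions for
# illustration; see the API reference for the exact contract.
import json
from urllib import request


def build_screen_request(region: str, subscription_key: str, text: str) -> request.Request:
    """Build a POST request to the ProcessText/Screen operation."""
    url = (f"https://{region}.api.cognitive.microsoft.com"
           "/contentmoderator/moderate/v1.0/ProcessText/Screen")
    return request.Request(
        url,
        data=text.encode("utf-8"),
        headers={
            "Content-Type": "text/plain",
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        method="POST",
    )


def screen_text(region: str, subscription_key: str, text: str) -> dict:
    """Send the request and return the service's JSON result.

    The returned dict carries the labels (flags) the service applied,
    which your app can then act on.
    """
    req = build_screen_request(region, subscription_key, text)
    with request.urlopen(req) as resp:  # requires a valid key and network access
        return json.load(resp)
```

Your app would call `screen_text(...)` and branch on the flags in the result, for example hiding or queueing content for human review when terms are matched.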

About Content Moderator
  Overview

Use custom image lists
  Quickstart
  How-To Guide

Use custom terms lists
  Quickstart
  How-To Guide

Analyze video content
  How-To Guide

Help and feedback
  Reference