Install and run Docker containers for the Anomaly Detector API
Important
Starting on 20 September 2023, you can't create new Anomaly Detector resources. The Anomaly Detector service will be retired on 1 October 2026.
Note
The container image location has recently changed. Read this article to see the updated location for this container.
Containers enable you to use the Anomaly Detector API in your own environment, and they can help you meet specific security and data governance requirements. In this article, you'll learn how to download, install, and run an Anomaly Detector container.
Anomaly Detector offers a single Docker container for using the API on-premises. Use the container to:
- Use the Anomaly Detector's algorithms on your data
- Monitor streaming data and detect anomalies as they occur in real time.
- Detect anomalies throughout your data set as a batch.
- Detect trend change points in your data set as a batch.
- Adjust the anomaly detection algorithm's sensitivity to better fit your data.
For detailed information about the API, see the Anomaly Detector API documentation.
If you don't have an Azure subscription, create a free trial account before you begin.
Prerequisites
You must meet the following prerequisites before using Anomaly Detector containers:
| Required | Purpose |
|---|---|
| Docker Engine | You need the Docker Engine installed on a host computer. Docker provides packages that configure the Docker environment on macOS, Windows, and Linux. For a primer on Docker and container basics, see the Docker overview. Docker must be configured to allow the containers to connect with and send billing data to Azure. On Windows, Docker must also be configured to support Linux containers. |
| Familiarity with Docker | You should have a basic understanding of Docker concepts, like registries, repositories, containers, and container images, as well as knowledge of basic `docker` commands. |
| Anomaly Detector resource | In order to use these containers, you must have an Azure Anomaly Detector resource to get the associated API key and endpoint URI. Both values are available on the Azure portal's Anomaly Detector Overview and Keys pages, and both are required to start the container. `{API_KEY}`: one of the two available resource keys on the Keys page. `{ENDPOINT_URI}`: the endpoint, as provided on the Overview page. |
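If you prefer the command line to the portal, the following Azure CLI sketch shows one way to create an Anomaly Detector resource. The resource name, resource group, SKU, and region are placeholder values; adjust them for your subscription.

```bash
# Sketch: create an Anomaly Detector resource with the Azure CLI.
# The name, resource group, SKU, and location below are example values.
az cognitiveservices account create \
  --name my-anomaly-detector \
  --resource-group my-resource-group \
  --kind AnomalyDetector \
  --sku F0 \
  --location westus2 \
  --yes
```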
Gather required parameters
Three primary parameters are required for all Azure AI containers: acceptance of the Microsoft Software License Terms (a value of `accept`), an endpoint URI, and an API key.
Endpoint URI
The `{ENDPOINT_URI}` value is available on the Azure portal Overview page of the corresponding Azure AI services resource. Go to the Overview page, hover over the endpoint, and a Copy to clipboard icon appears. Copy and use the endpoint where needed.
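If you'd rather not use the portal, the endpoint can also be read with the Azure CLI. This is a sketch with placeholder resource and group names:

```bash
# Sketch: read the endpoint URI of an existing resource (placeholder names).
az cognitiveservices account show \
  --name my-anomaly-detector \
  --resource-group my-resource-group \
  --query "properties.endpoint" \
  --output tsv
```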
Keys
The `{API_KEY}` value is used to start the container and is available on the Azure portal's Keys page of the corresponding Azure AI services resource. Go to the Keys page, and select the Copy to clipboard icon.
Important
These subscription keys are used to access your Azure AI services API. Don't share your keys. Store them securely. For example, use Azure Key Vault. We also recommend that you regenerate these keys regularly. Only one key is necessary to make an API call. When you regenerate the first key, you can use the second key for continued access to the service.
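As a sketch, the keys can also be listed and rotated with the Azure CLI; the resource and group names here are placeholders:

```bash
# Sketch: list the two resource keys, then regenerate key 1 while clients use key 2.
az cognitiveservices account keys list \
  --name my-anomaly-detector \
  --resource-group my-resource-group

az cognitiveservices account keys regenerate \
  --name my-anomaly-detector \
  --resource-group my-resource-group \
  --key-name Key1
```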
The host computer
The host is an x64-based computer that runs the Docker container. It can be a computer on your premises or a Docker hosting service in Azure, such as:
- Azure Kubernetes Service.
- Azure Container Instances.
- A Kubernetes cluster deployed to Azure Stack. For more information, see Deploy Kubernetes to Azure Stack.
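For example, if you choose Azure Container Instances, a deployment might look roughly like the following Azure CLI sketch. The names and CPU/memory sizes are example values, and you substitute the `{ENDPOINT_URI}` and `{API_KEY}` values described earlier:

```bash
# Sketch: run the Anomaly Detector container on Azure Container Instances.
# Names and sizes are examples; Eula, Billing, and ApiKey are the required billing settings.
az container create \
  --resource-group my-resource-group \
  --name anomaly-detector \
  --image mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest \
  --cpu 4 \
  --memory 8 \
  --ports 5000 \
  --environment-variables Eula=accept Billing={ENDPOINT_URI} \
  --secure-environment-variables ApiKey={API_KEY}
```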
Container requirements and recommendations
The following table describes the minimum and recommended CPU cores and memory to allocate for the Anomaly Detector container.
| QPS (queries per second) | Minimum | Recommended |
|---|---|---|
| 10 QPS | 4 cores, 1 GB memory | 8 cores, 2 GB memory |
| 20 QPS | 8 cores, 2 GB memory | 16 cores, 4 GB memory |
Each core must be at least 2.6 gigahertz (GHz) or faster. Core and memory correspond to the `--cpus` and `--memory` settings, which are used as part of the `docker run` command.
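For example, to allocate the minimum listed for 10 QPS, you might run something like the following sketch; adjust the values to match your workload (the full `docker run` syntax is covered later in this article):

```bash
# Sketch: allocate the 10 QPS minimum from the table above (4 cores, 1 GB memory).
docker run --rm -it -p 5000:5000 --cpus 4 --memory 1g \
  mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```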
Get the container image with docker pull
The Anomaly Detector container image can be found on the `mcr.microsoft.com` container registry syndicate. It resides within the `azure-cognitive-services/decision` repository and is named `anomaly-detector`. The fully qualified container image name is `mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector`.
To use the latest version of the container, you can use the `latest` tag. You can also find a full list of image tags on the MCR.
Use the `docker pull` command to download a container image.
| Container | Repository |
|---|---|
| cognitive-services-anomaly-detector | `mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest` |
Tip
When using `docker pull`, pay close attention to the casing of the container registry, repository, container image name, and corresponding tag. They are case sensitive.
Tip
You can use the `docker images` command to list your downloaded container images. For example, the following command lists the ID, repository, and tag of each downloaded container image, formatted as a table:

```bash
docker images --format "table {{.ID}}\t{{.Repository}}\t{{.Tag}}"
```

```
IMAGE ID            REPOSITORY                  TAG
<image-id>          <repository-path/name>      <tag-name>
```
Docker pull for the Anomaly Detector container
```bash
docker pull mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest
```
How to use the container
Once the container is on the host computer, use the following process to work with the container.
1. Run the container, with the required billing settings. More examples of the `docker run` command are available.
2. Query the container's prediction endpoint.
Run the container with docker run
Use the `docker run` command to run the container. Refer to Gather required parameters for details on how to get the `{ENDPOINT_URI}` and `{API_KEY}` values.
Examples of the `docker run` command are available.
```bash
docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
  mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```
This command:
- Runs an Anomaly Detector container from the container image
- Allocates one CPU core and 4 gigabytes (GB) of memory
- Exposes TCP port 5000 and allocates a pseudo-TTY for the container
- Automatically removes the container after it exits. The container image is still available on the host computer.
Important
The `Eula`, `Billing`, and `ApiKey` options must be specified to run the container; otherwise, the container won't start. For more information, see Billing.
Running multiple containers on the same host
If you intend to run multiple containers with exposed ports, make sure to run each container with a different port. For example, run the first container on port 5000 and the second container on port 5001.
Replace `<container-registry>` and `<container-name>` with the values for the containers you use. They don't have to be the same container: you can run the Anomaly Detector container and the LUIS container on the host together, or you can run multiple Anomaly Detector containers.
Run the first container on host port 5000.
```bash
docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
  <container-registry>/microsoft/<container-name> \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```
Run the second container on host port 5001.
```bash
docker run --rm -it -p 5001:5000 --memory 4g --cpus 1 \
  <container-registry>/microsoft/<container-name> \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```
Each subsequent container should be on a different port.
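To confirm which host ports your running containers are mapped to, you can use `docker ps`, for example:

```bash
# List running containers with their images and port mappings.
docker ps --format "table {{.Names}}\t{{.Image}}\t{{.Ports}}"
```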
Query the container's prediction endpoint
The container provides REST-based query prediction endpoint APIs.
Use the host, `http://localhost:5000`, for container APIs.
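As a sketch, a batch (entire-series) detection request against a local container might look like the following. The route and request body follow the Anomaly Detector v1.0 REST API; confirm the exact paths and schema on the container's `/swagger` page. The time series here is abbreviated sample data (the real API expects at least 12 points):

```bash
# Sketch: batch anomaly detection against a locally running container.
# The series below is made-up, abbreviated sample data.
curl -X POST "http://localhost:5000/anomalydetector/v1.0/timeseries/entire/detect" \
  -H "Content-Type: application/json" \
  -d '{
        "granularity": "daily",
        "series": [
          { "timestamp": "2024-01-01T00:00:00Z", "value": 32.0 },
          { "timestamp": "2024-01-02T00:00:00Z", "value": 31.5 },
          { "timestamp": "2024-01-03T00:00:00Z", "value": 33.1 },
          { "timestamp": "2024-01-04T00:00:00Z", "value": 95.0 }
        ]
      }'
```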
Validate that a container is running
There are several ways to validate that the container is running. Locate the External IP address and exposed port of the container in question, and open your favorite web browser. Use the various request URLs that follow to validate that the container is running. The example request URLs listed here are `http://localhost:5000`, but your specific container might vary. Make sure to rely on your container's External IP address and exposed port.
| Request URL | Purpose |
|---|---|
| `http://localhost:5000/` | The container provides a home page. |
| `http://localhost:5000/ready` | Requested with GET, this URL provides a verification that the container is ready to accept a query against the model. This request can be used for Kubernetes liveness and readiness probes. |
| `http://localhost:5000/status` | Also requested with GET, this URL verifies whether the api-key used to start the container is valid, without causing an endpoint query. This request can be used for Kubernetes liveness and readiness probes. |
| `http://localhost:5000/swagger` | The container provides a full set of documentation for the endpoints and a Try it out feature. With this feature, you can enter your settings into a web-based HTML form and make the query without having to write any code. After the query returns, an example CURL command is provided to demonstrate the HTTP headers and body format that's required. |
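For example, you can exercise the readiness and status endpoints from the command line with `curl`:

```bash
# Quick checks against a locally running container.
curl -i http://localhost:5000/ready
curl -i http://localhost:5000/status
```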
Stop the container
To shut down the container, in the command-line environment where the container is running, select Ctrl+C.
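If you started the container detached (with `-d` instead of `-it`), Ctrl+C doesn't apply; in that case, a sketch like the following finds and stops it:

```bash
# Sketch: find and stop a detached Anomaly Detector container.
docker ps --filter "ancestor=mcr.microsoft.com/azure-cognitive-services/decision/anomaly-detector:latest"
docker stop <container-id>
```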
Troubleshooting
If you run the container with an output mount and logging enabled, the container generates log files that are helpful for troubleshooting issues that occur while starting or running the container.
Tip
For more troubleshooting information and guidance, see Azure AI containers frequently asked questions (FAQ).
If you're having trouble running an Azure AI services container, you can try using the Microsoft diagnostics container. Use this container to diagnose common errors in your deployment environment that might prevent Azure AI containers from functioning as expected.
To get the container, use the following `docker pull` command:
```bash
docker pull mcr.microsoft.com/azure-cognitive-services/diagnostic
```
Then run the container. Replace `{ENDPOINT_URI}` with your endpoint, and replace `{API_KEY}` with the key for your resource:
```bash
docker run --rm mcr.microsoft.com/azure-cognitive-services/diagnostic \
  eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```
The container will test for network connectivity to the billing endpoint.
Billing
The Anomaly Detector containers send billing information to Azure, using an Anomaly Detector resource on your Azure account.
Queries to the container are billed at the pricing tier of the Azure resource that's used for the `ApiKey` parameter.
Azure AI services containers aren't licensed to run without being connected to the metering or billing endpoint. You must enable the containers to communicate billing information with the billing endpoint at all times. Azure AI services containers don't send customer data, such as the image or text that's being analyzed, to Azure.
Connect to Azure
The container needs the billing argument values to run. These values allow the container to connect to the billing endpoint. The container reports usage about every 10 to 15 minutes. If the container doesn't connect to Azure within the allowed time window, the container continues to run but doesn't serve queries until the billing endpoint is restored. The connection is attempted 10 times at the same time interval of 10 to 15 minutes. If it can't connect to the billing endpoint within the 10 tries, the container stops serving requests.
Billing arguments
The `docker run` command will start the container when all three of the following options are provided with valid values:
| Option | Description |
|---|---|
| `ApiKey` | The API key of the Azure AI services resource that's used to track billing information. The value of this option must be set to an API key for the provisioned resource that's specified in `Billing`. |
| `Billing` | The endpoint of the Azure AI services resource that's used to track billing information. The value of this option must be set to the endpoint URI of a provisioned Azure resource. |
| `Eula` | Indicates that you accepted the license for the container. The value of this option must be set to `accept`. |
For more information about these options, see Configure containers.
Summary
In this article, you learned concepts and workflow for downloading, installing, and running Anomaly Detector containers. In summary:
- Anomaly Detector provides one Linux container for Docker, which provides anomaly detection on both batch and streaming data, expected range inference, and sensitivity tuning.
- Container images are downloaded from a private Azure Container Registry dedicated for containers.
- Container images run in Docker.
- You can use either the REST API or SDK to call operations in Anomaly Detector containers by specifying the host URI of the container.
- You must specify billing information when instantiating a container.
Important
Azure AI containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to communicate billing information with the metering service at all times. Azure AI containers do not send customer data (e.g., the time series data that is being analyzed) to Microsoft.
Next steps
- Review Configure containers for configuration settings
- Deploy an Anomaly Detector container to Azure Container Instances
- Learn more about Anomaly Detector API service