Learn how to deploy a model from Azure Machine Learning as a function app in Azure Functions.
Important
While both Azure Machine Learning and Azure Functions are generally available, the ability to package a model from the Machine Learning service for Functions is in preview.
With Azure Machine Learning, you can create Docker images from trained machine learning models. Azure Machine Learning now has the preview functionality to build these machine learning models into function apps, which can be deployed into Azure Functions.
- An Azure Machine Learning workspace. For more information, see the Create a workspace article.
- The Azure CLI.
- A trained machine learning model registered in your workspace. If you do not have a model, use the Image classification tutorial: train model to train and register one.
Important
The code snippets in this article assume that you have set the following variables:
- ws - Your Azure Machine Learning workspace.
- model - The registered model that will be deployed.
- inference_config - The inference configuration for the model.
For more information on setting these variables, see Deploy models with Azure Machine Learning.
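For example, a minimal sketch of how these variables might be set is shown below. The configuration file location and the model name ("sklearn_mnist") are assumptions for illustration; use your own workspace configuration and registered model name. The inference_config variable is created later in this article.
from azureml.core import Workspace
from azureml.core.model import Model

# Load the workspace from a downloaded config.json (assumed to be in the current directory).
ws = Workspace.from_config()

# Reference a model that is already registered in the workspace.
# "sklearn_mnist" is a placeholder name; use the name your model was registered under.
model = Model(ws, name="sklearn_mnist")

# inference_config is created later in this article, after the entry script
# and environment have been defined.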
Before deploying, you must define what is needed to run the model as a web service. The following list describes the core items needed for a deployment:
- An entry script. This script accepts requests, scores the request using the model, and returns the results.
Important
The entry script is specific to your model; it must understand the format of the incoming request data, the format of the data expected by your model, and the format of the data returned to clients.
If the request data is in a format that is not usable by your model, the script can transform it into an acceptable format. It may also transform the response before returning it to the client.
By default when packaging for functions, the input is treated as text. If you are interested in consuming the raw bytes of the input (for instance for Blob triggers), you should use AMLRequest to accept raw data.
For more information on entry scripts, see Define scoring code. A minimal example entry script is sketched after this list.
- Dependencies, such as helper scripts or Python/Conda packages required to run the entry script or model
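To make the entry script concrete, here is a minimal sketch of a score.py for a scikit-learn model that accepts JSON input. The registered model name ("sklearn_mnist"), the pickled model file, and the input format are assumptions for illustration; adapt them to your own model.
import json
import joblib
import numpy as np
from azureml.core.model import Model

# Note: this sketch treats the input as text/JSON. To consume raw bytes
# (for example, with blob triggers on binary data), use AMLRequest as noted above.

def init():
    # Called once when the container starts; load the model into a global variable.
    global model
    # Model.get_model_path resolves the path of the registered model inside the image.
    # "sklearn_mnist" is a placeholder; use the name your model was registered under.
    model_path = Model.get_model_path("sklearn_mnist")
    model = joblib.load(model_path)

def run(raw_data):
    # Called for each request. The input is assumed to be JSON with a "data" key,
    # matching the sample payload shown later in this article.
    data = np.array(json.loads(raw_data)["data"])
    result = model.predict(data)
    # Return a JSON-serializable object.
    return result.tolist()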
These entities are encapsulated into an inference configuration. The inference configuration references the entry script and other dependencies.
Important
When creating an inference configuration for use with Azure Functions, you must use an Environment object. Please note that if you are defining a custom environment, you must add azureml-defaults with version >= 1.0.45 as a pip dependency. This package contains the functionality needed to host the model as a web service. The following example demonstrates creating an environment object and using it with an inference configuration:
from azureml.core.environment import Environment
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import InferenceConfig

# Create an environment and add conda dependencies to it
myenv = Environment(name="myenv")
# Enable Docker based environment
myenv.docker.enabled = True
# Build conda dependencies
myenv.python.conda_dependencies = CondaDependencies.create(conda_packages=['scikit-learn'],
                                                           pip_packages=['azureml-defaults'])
inference_config = InferenceConfig(entry_script="score.py", environment=myenv)
For more information on environments, see Create and manage environments for training and deployment.
For more information on inference configuration, see Deploy models with Azure Machine Learning.
Important
When deploying to Functions, you do not need to create a deployment configuration.
To build packages for Azure Functions, you must install the SDK preview package.
pip install azureml-contrib-functions
To create the Docker image that is deployed to Azure Functions, use azureml.contrib.functions.package or the specific package function for the trigger you are interested in using. The following code snippet demonstrates how to create a new package with a blob trigger from the model and inference configuration:
Note
The code snippet assumes that model contains a registered model, and that inference_config contains the configuration for the inference environment. For more information, see Deploy models with Azure Machine Learning.
from azureml.contrib.functions import package
from azureml.contrib.functions import BLOB_TRIGGER
blob = package(ws, [model], inference_config, functions_enabled=True, trigger=BLOB_TRIGGER, input_path="input/{blobname}.json", output_path="output/{blobname}_out.json")
blob.wait_for_creation(show_output=True)
# Display the package location/ACR path
print(blob.location)
When show_output=True, the output of the Docker build process is shown. Once the build finishes, the image has been created in the Azure Container Registry for your workspace and its location is displayed. The location returned is in the format <acrinstance>.azurecr.cn/package@sha256:<imagename>.
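If you prefer to extract the registry name from this value programmatically instead of copying it by hand, a small sketch like the following works against the format shown above (blob is the package object created earlier):
# blob.location has the form <acrinstance>.azurecr.cn/package@sha256:<imagename>
location = blob.location
# The registry name (referred to as <myacr> and <acrinstance> in the CLI commands below)
# is everything before the first dot.
acr_name = location.split(".", 1)[0]
print(acr_name)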
Note
Packaging for functions currently supports HTTP triggers, blob triggers, and service bus triggers. For more information on triggers, see Azure Functions bindings.
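As an illustration of the other trigger types, packaging the same model behind an HTTP trigger might look like the sketch below. This assumes azureml.contrib.functions exposes an HTTP_TRIGGER constant analogous to the BLOB_TRIGGER constant used earlier; verify the exact constant and parameter names against the azureml-contrib-functions reference for your SDK version.
from azureml.contrib.functions import package, HTTP_TRIGGER  # HTTP_TRIGGER is assumed; see the note above

# Package the model as an HTTP-triggered function app image (sketch).
http_package = package(ws, [model], inference_config, functions_enabled=True, trigger=HTTP_TRIGGER)
http_package.wait_for_creation(show_output=True)
print(http_package.location)
The rest of this article continues with the blob trigger package created earlier.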
Important
Save the location information, as it is used when deploying the image.
Use the following command to get the login credentials for the Azure Container Registry that contains the image. Replace <myacr> with the value returned previously from blob.location:
az acr credential show --name <myacr>
The output of this command is similar to the following JSON document:
{ "passwords": [ { "name": "password", "value": "Iv0lRZQ9762LUJrFiffo3P4sWgk4q+nW" }, { "name": "password2", "value": "=pKCxHatX96jeoYBWZLsPR6opszr==mg" } ], "username": "myml08024f78fd10" }
Save the value for username and one of the passwords.
If you do not already have a resource group or app service plan to deploy the service, the following commands demonstrate how to create both:
az group create --name myresourcegroup --location "China East 2"
az appservice plan create --name myplanname --resource-group myresourcegroup --sku B1 --is-linux
In this example, a Linux basic pricing tier (--sku B1) is used.
Important
Images created by Azure Machine Learning use Linux, so you must use the --is-linux parameter.
Create the storage account to use for the web job storage and get its connection string. Replace <webjobStorage> with the name you want to use.
az storage account create --name <webjobStorage> --location chinaeast2 --resource-group myresourcegroup --sku Standard_LRS
az storage account show-connection-string --resource-group myresourcegroup --name <webjobStorage> --query connectionString --output tsv
To create the function app, use the following command. Replace <app-name> with the name you want to use. Replace <acrinstance> and <imagename> with the values returned earlier from blob.location. Replace <webjobStorage> with the name of the storage account from the previous step:
az functionapp create --resource-group myresourcegroup --plan myplanname --name <app-name> --deployment-container-image-name <acrinstance>.azurecr.cn/package:<imagename> --storage-account <webjobStorage>
Important
At this point, the function app has been created. However, since you haven't provided the connection string for the blob trigger or credentials to the Azure Container Registry that contains the image, the function app is not active. In the next steps, you provide the connection string and the authentication information for the container registry.
Create the storage account to use for the blob trigger storage and get its connection string. Replace <triggerStorage> with the name you want to use.
az storage account create --name <triggerStorage> --location chinaeast2 --resource-group myresourcegroup --sku Standard_LRS
az storage account show-connection-string --resource-group myresourcegroup --name <triggerStorage> --query connectionString --output tsv
Record this connection string to provide to the function app. We will use it later when we ask for <triggerConnectionString>.
Create the containers for the input and output in the storage account. Replace <triggerConnectionString> with the connection string returned earlier:
az storage container create -n input --connection-string <triggerConnectionString>
az storage container create -n output --connection-string <triggerConnectionString>
To associate the trigger connection string with the function app, use the following command. Replace <app-name> with the name of the function app. Replace <triggerConnectionString> with the connection string returned earlier:
az functionapp config appsettings set --name <app-name> --resource-group myresourcegroup --settings "TriggerConnectionString=<triggerConnectionString>"
You will need to retrieve the tag associated with the created container using the following command. Replace <username> with the username returned earlier from the container registry:
az acr repository show-tags --repository package --name <username> --output tsv
Save the value returned; it will be used as the imagetag in the next step.
To provide the function app with the credentials needed to access the container registry, use the following command. Replace <app-name> with the name of the function app. Replace <acrinstance> and <imagetag> with the values from the Azure CLI call in the previous step. Replace <username> and <password> with the ACR login information retrieved earlier:
az functionapp config container set --name <app-name> --resource-group myresourcegroup --docker-custom-image-name <acrinstance>.azurecr.cn/package:<imagetag> --docker-registry-server-url https://<acrinstance>.azurecr.cn --docker-registry-server-user <username> --docker-registry-server-password <password>
This command returns information similar to the following JSON document:
[ { "name": "WEBSITES_ENABLE_APP_SERVICE_STORAGE", "slotSetting": false, "value": "false" }, { "name": "DOCKER_REGISTRY_SERVER_URL", "slotSetting": false, "value": "https://myml08024f78fd10.azurecr.cn" }, { "name": "DOCKER_REGISTRY_SERVER_USERNAME", "slotSetting": false, "value": "myml08024f78fd10" }, { "name": "DOCKER_REGISTRY_SERVER_PASSWORD", "slotSetting": false, "value": null }, { "name": "DOCKER_CUSTOM_IMAGE_NAME", "value": "DOCKER|myml08024f78fd10.azurecr.cn/package:20190827195524" } ]
At this point, the function app begins loading the image.
Important
It may take several minutes before the image has loaded. You can monitor progress using the Azure Portal.
Once the image has loaded and the app is available, use the following steps to trigger the app:
Create a text file that contains the data that the score.py file expects. The following example would work with a score.py that expects an array of 10 numbers:
{"data": [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10], [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]]}
Important
The format of the data depends on what your score.py file and model expect.
Use the following command to upload this file to the input container in the blob trigger storage account created earlier. Replace <file> with the name of the file containing the data. Replace <triggerConnectionString> with the connection string returned earlier. In this example, input is the name of the input container created earlier. If you used a different name, replace this value:
az storage blob upload --container-name input --file <file> --name <file> --connection-string <triggerConnectionString>
The output of this command is similar to the following JSON:
{ "etag": "\"0x8D7C21528E08844\"", "lastModified": "2020-03-06T21:27:23+00:00" }
To view the output produced by the function, use the following command to list the output files generated. Replace <triggerConnectionString> with the connection string returned earlier. In this example, output is the name of the output container created earlier. If you used a different name, replace this value:
az storage blob list --container-name output --connection-string <triggerConnectionString> --query '[].name' --output tsv
The output of this command is similar to sample_input_out.json.
To download the file and inspect the contents, use the following command. Replace <file> with the file name returned by the previous command. Replace <triggerConnectionString> with the connection string returned earlier:
az storage blob download --container-name output --file <file> --name <file> --connection-string <triggerConnectionString>
Once the command completes, open the file. It contains the data returned by the model.
For more information on using blob triggers, see the Create a function triggered by Azure Blob storage article.
- Learn to configure your Functions App in the Functions documentation.
- Learn more about blob storage triggers in Azure Blob storage bindings.
- Deploy your model to Azure App Service.
- Consume an ML model deployed as a web service
- API Reference