In this quickstart, you use Azure Developer command-line tools to build a serverless workflow that orchestrates multiple tasks running in parallel. After testing the code locally, you deploy it to a new serverless function app running in a Flex Consumption plan in Azure Functions.
The project uses the Azure Developer CLI (azd) to simplify deploying your code to Azure. This deployment follows current best practices for secure and scalable Azure Functions deployments. This quickstart demonstrates the fan-out/fan-in pattern in Durable Functions, an extension that orchestrates stateful workflows with durable execution. The sample fetches article titles in parallel—the orchestration fans out to multiple activities running concurrently, then fans back in to aggregate the results.
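The concurrency shape of this pattern can be sketched with plain asyncio. This is only an analogy for the fan-out/fan-in flow, not Durable Functions code; the fetch_title stand-in and its return value are invented for illustration:

```python
import asyncio

async def fetch_title(url: str) -> str:
    # Stand-in for an activity that fetches one article title.
    await asyncio.sleep(0)
    return f"title of {url}"

async def orchestrate(urls: list[str]) -> str:
    tasks = [fetch_title(u) for u in urls]   # fan out: start all work items
    results = await asyncio.gather(*tasks)   # fan in: wait for every result
    return "; ".join(results)                # aggregate into one string

print(asyncio.run(orchestrate(["/a", "/b"])))
```

In Durable Functions, the same shape is durable: the orchestrator replays reliably after failures, which plain asyncio doesn't provide.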
By default, the Flex Consumption plan follows a pay-for-what-you-use billing model, which means completing this quickstart incurs a small cost of a few USD cents or less in your Azure account.
Prerequisites
An Azure account with an active subscription. Create an account.
Initialize the project
Use the azd init command to create a local Durable Functions code project from a template.
In your local terminal or command prompt, run this azd init command in an empty folder:

azd init --template durable-functions-quickstart-dotnet-azd -e dfquickstart-dotnet

This command pulls the project files from the template repository and initializes the project in the current folder. The -e flag sets a name for the current environment. In azd, the environment maintains a unique deployment context for your app, and you can define more than one. The environment name is also used in the name of the resource group you create in Azure.

Run this command to navigate to the fanoutfanin app folder:

cd fanoutfanin

Create a file named local.settings.json in the fanoutfanin folder that contains this JSON data:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}

This file is required when running locally.
In your local terminal or command prompt, run this azd init command in an empty folder:

azd init --template durable-functions-quickstart-python-azd -e dfquickstart-python

This command pulls the project files from the template repository and initializes the project in the current folder. The -e flag sets a name for the current environment. In azd, the environment maintains a unique deployment context for your app, and you can define more than one. The environment name is also used in the name of the resource group you create in Azure.

Run this command to navigate to the src app folder:

cd src

Create a file named local.settings.json in the src folder that contains this JSON data:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python"
  }
}

This file is required when running locally.
Create and activate a virtual environment
In the src folder, run these commands to create and activate a virtual environment named .venv:
python3 -m venv .venv
source .venv/bin/activate
If Python doesn't install the venv package on your Linux distribution, run the following command:
sudo apt-get install python3-venv
Install Python dependencies
From the src folder with the virtual environment activated, run this command to install the required dependencies:
pip install -r requirements.txt
In your local terminal or command prompt, run this azd init command in an empty folder:

azd init --template durable-functions-quickstart-typescript-azd -e dfquickstart-typescript

This command pulls the project files from the template repository and initializes the project in the current folder. The -e flag sets a name for the current environment. In azd, the environment maintains a unique deployment context for your app, and you can define more than one. The environment name is also used in the name of the resource group you create in Azure.

Run this command to navigate to the src app folder:

cd src

Create a file named local.settings.json in the src folder that contains this JSON data:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  }
}

This file is required when running locally.
Install dependencies
From the src folder, run these commands to install the required dependencies and build the project:
npm install
npm run build
Start Azurite
The Functions runtime needs a storage component. The "AzureWebJobsStorage": "UseDevelopmentStorage=true" setting in the local.settings.json file directs the runtime to use the local storage emulator, Azurite.
Run this command to start Azurite:
npx azurite --skipApiVersionCheck --location ~/azurite-data
Keep Azurite running in the terminal window. You need it running while testing locally.
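For reference, UseDevelopmentStorage=true is shorthand for the full local emulator connection string, which uses Azurite's fixed, publicly documented account name and key (shown here split across lines for readability):

```
DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;
QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;
TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;
```

These credentials only work against the local emulator, so there's no secret to protect.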
Run in your local environment
From the fanoutfanin folder in a new terminal window, run this command to start the Functions host:

func start

When the Functions host starts in your local project folder, it writes the local URL endpoints of your HTTP triggered functions to the terminal output.
Note
Because access key authorization isn't enforced when running locally, you don't need an access key to call your function.
In your browser, make a GET request to the endpoint that starts the orchestration:
http://localhost:7071/api/FetchOrchestration_HttpStart
This request starts a new orchestration instance. The orchestration fans out to several activities to fetch the titles of Microsoft Learn articles in parallel. When the activities finish, the orchestration fans back in and returns the titles as a formatted string.
From the src folder in a new terminal window with the virtual environment activated, run this command to start the Functions host:

func start

When the Functions host starts in your local project folder, it writes the local URL endpoints of your HTTP triggered functions to the terminal output.
Note
Because access key authorization isn't enforced when running locally, you don't need an access key to call your function.
In your browser, make a GET request to the HTTP start endpoint:
http://localhost:7071/api/orchestrators/fetch_orchestration
This request starts a new orchestration instance. The orchestration fans out to several activities to fetch the titles of Microsoft Learn articles in parallel. When the activities finish, the orchestration fans back in and returns the titles as a formatted string.
From the src folder in a new terminal window, run this command to start the Functions host:

func start

When the Functions host starts in your local project folder, it writes the local URL endpoints of your HTTP triggered functions to the terminal output.
Note
Because access key authorization isn't enforced when running locally, you don't need an access key to call your function.
In your browser, make a GET request to the HTTP start endpoint:
http://localhost:7071/api/orchestrators/fetchOrchestration
This request starts a new orchestration instance. The orchestration fans out to several activities to fetch the titles of Microsoft Learn articles in parallel. When the activities finish, the orchestration fans back in and returns the titles as a formatted string.
The HTTP endpoint returns a JSON response with several URLs. The statusQueryGetUri endpoint provides the orchestration status.

Copy the statusQueryGetUri value and paste it into your browser or HTTP test tool to check the status of the orchestration. When the orchestration completes, you see the fetched article titles in the response.

When you're done, press Ctrl+C in the terminal window to stop the func host process.
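If you'd rather script the status check than paste the URL into a browser, you can poll statusQueryGetUri and inspect the standard runtimeStatus and output fields of the response. A minimal sketch of the parsing step (the sample payload below is invented; a real response contains more fields, and the actual output depends on your run):

```python
import json

# Invented example of the kind of JSON the statusQueryGetUri endpoint returns.
sample_response = """
{
  "name": "FetchOrchestration",
  "instanceId": "abc123",
  "runtimeStatus": "Completed",
  "output": "Durable Functions overview; Azure Functions scenarios"
}
"""

status = json.loads(sample_response)
if status["runtimeStatus"] == "Completed":
    result = status["output"]  # the aggregated article titles
else:
    result = f"still {status['runtimeStatus']}, poll again"

print(result)
```

In a real script, you'd fetch the JSON from the statusQueryGetUri URL in a loop until runtimeStatus reaches a terminal state such as Completed or Failed.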
- Run deactivate to shut down the virtual environment.
Review the code (optional)
You can review the code that implements the fan-out/fan-in pattern:
The title fetching activities are tracked using a dynamic task list. The line await Task.WhenAll(parallelTasks); waits for all the called activities, which run concurrently, to complete. When done, all outputs are aggregated as a formatted string.
[Function(nameof(FetchOrchestration))]
public static async Task<string> RunOrchestrator(
    [OrchestrationTrigger] TaskOrchestrationContext context)
{
    ILogger logger = context.CreateReplaySafeLogger(nameof(FetchOrchestration));
    logger.LogInformation("Fetching data.");

    var parallelTasks = new List<Task<string>>();

    // List of URLs to fetch titles from
    var urls = new List<string>
    {
        "/azure-functions/durable/durable-functions-overview",
        "/azure-functions/durable/durable-task-scheduler/durable-task-scheduler",
        "/azure-functions/functions-scenarios",
        "/azure-functions/functions-create-ai-enabled-apps",
    };

    // Run fetching tasks in parallel
    foreach (var url in urls)
    {
        Task<string> task = context.CallActivityAsync<string>(nameof(FetchTitleAsync), url);
        parallelTasks.Add(task);
    }

    // Wait for all the parallel tasks to complete before continuing
    await Task.WhenAll(parallelTasks);

    // Return fetched titles as a formatted string
    return string.Join("; ", parallelTasks.Select(t => t.Result));
}
You can review the complete template project here.
The title fetching activities are tracked using a dynamic task list. The line yield context.task_all(tasks) waits for all the called activities, which run concurrently, to complete. When done, all outputs are aggregated as a formatted string.
# List of URLs to fetch titles from
urls = [
    "/azure-functions/durable/durable-functions-overview",
    "/azure-functions/durable/durable-task-scheduler/durable-task-scheduler",
    "/azure-functions/functions-scenarios",
    "/azure-functions/functions-create-ai-enabled-apps",
]

# Run fetching tasks in parallel
tasks = []
for url in urls:
    task = context.call_activity("fetch_title", url)
    tasks.append(task)

# Wait for all the parallel tasks to complete before continuing
results = yield context.task_all(tasks)

# Return fetched titles as a formatted string
return "; ".join(results)

@myApp.activity_trigger(input_name="url")
async def fetch_title(url: str):
    """Activity function that fetches the title from a URL."""
    logger = logging.getLogger("FetchTitle")
    logger.info(f"Fetching from url {url}.")
    try:
        async with ClientSession() as session:
You can review the complete template project here.
The title fetching activities are tracked using a dynamic task list. The line yield context.df.Task.all(parallelTasks) waits for all the called activities, which run concurrently, to complete. When done, all outputs are aggregated as a formatted string.
const urls = [
    "/azure-functions/durable/durable-functions-overview",
    "/azure-functions/durable/durable-task-scheduler/durable-task-scheduler",
    "/azure-functions/functions-scenarios",
    "/azure-functions/functions-create-ai-enabled-apps",
];

// Run fetching tasks in parallel
const parallelTasks = [];
for (const url of urls) {
    const task = context.df.callActivity(fetchTitleActivityName, url);
    parallelTasks.push(task);
}

// Wait for all the parallel tasks to complete before continuing
const results: string[] = yield context.df.Task.all(parallelTasks);

// Return fetched titles as a formatted string
return results.join("; ");
};

df.app.orchestration("fetchOrchestration", fetchOrchestration);

const fetchTitleAsync: ActivityHandler = async function (
    url: string,
    context: InvocationContext
You can review the complete template project here.
After you verify your functions locally, it's time to publish them to Azure.
Deploy to Azure
This project is configured to use the azd up command, run from the project root folder, to deploy this project to a new function app in a Flex Consumption plan in Azure.
Tip
The project includes a set of Bicep files (in the infra folder) that azd uses to create a secure deployment to a Flex Consumption plan that follows best practices.
In the project root folder, which contains the azure.yaml file, run this command to authenticate with your Azure account:
azd auth login

For this quickstart, run this command to disable the virtual network deployment:

azd env set VNET_ENABLED false

Run this command from the root project folder to have azd create the required Azure resources and deploy your code project to the new function app:

azd up

The root folder contains the azure.yaml definition file required by azd.

When prompted, provide these required deployment parameters:

- Azure subscription: Subscription in which your resources are created.
- Azure location: Azure region in which to create the resource group that contains the new Azure resources. Only regions that currently support the Flex Consumption plan are shown.

The azd up command uses your responses to these prompts with the Bicep configuration files to complete these deployment tasks:

Create and configure these required Azure resources:
- Flex Consumption plan and function app
- Azure Storage (required) and Application Insights (recommended)
- Access policies and roles for your account
- Service-to-service connections using managed identities (instead of stored connection strings)
Package and deploy your code to the deployment container. The app is then started and runs in the deployed package.
After the command completes successfully, you see links to the resources you created.
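As a rough illustration of what the managed-identity connections mean in practice: rather than storing a key-bearing AzureWebJobsStorage connection string, the deployed app uses the Functions identity-based connection convention for its storage settings, along these lines (the account name is a placeholder, and the exact settings azd writes may differ):

```
AzureWebJobsStorage__accountName = <your-storage-account-name>
```

The Functions host then authenticates to the storage account with the app's managed identity, so no storage key ever appears in app settings.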
Invoke the function on Azure
You can now invoke your orchestration endpoint in Azure by making an HTTP request to its URL. When your functions run in Azure, access key authorization is enforced, and you must provide a function access key with your request.
You can use the Core Tools to get the URL endpoint of the HTTP trigger that starts the orchestration in Azure.
In your local terminal or command prompt, run these commands to get the URL endpoint values:
APP_NAME=$(azd env get-value AZURE_FUNCTION_NAME)
func azure functionapp list-functions $APP_NAME --show-keys

The azd env get-value command gets your function app name from the local environment. When you use the --show-keys option with func azure functionapp list-functions, the returned Invoke URL value for each endpoint includes any required function-level access keys.

As before, use a browser or HTTP test tool to start the orchestration in your function app running in Azure.
Redeploy your code
Run the azd up command as many times as you need to both provision your Azure resources and deploy code updates to your function app.
Note
Deployed code files are always overwritten by the latest deployment package.
Your initial responses to azd prompts and any environment variables generated by azd are stored locally in your named environment. Use the azd env get-values command to review all of the variables in your environment that you used when creating Azure resources.
Clean up resources
When you're done working with your function app and related resources, use this command to delete the function app and its related resources from Azure and avoid incurring any further costs:
azd down --no-prompt
Note
The --no-prompt option instructs azd to delete your resource group without a confirmation from you.
This command doesn't affect your local code project.