Use PowerShell to create a data factory pipeline to copy data in the cloud
This sample PowerShell script creates a pipeline in Azure Data Factory that copies data from one location to another in Azure Blob storage.
Note
We recommend that you use the Azure Az PowerShell module to interact with Azure. See Install Azure PowerShell to get started. To learn how to migrate to the Az PowerShell module, see Migrate Azure PowerShell from AzureRM to Az.
This sample requires Azure PowerShell. Run Get-Module -ListAvailable Az to find the version.
If you need to install or upgrade, see Install Azure PowerShell module.
Run the Connect-AzAccount -Environment AzureChinaCloud cmdlet to connect to Azure operated by 21Vianet.
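For example, a typical check, upgrade, and sign-in sequence (assuming you install the module from the PowerShell Gallery) looks like this:

# Check which versions of the Az module are installed
Get-Module -ListAvailable Az

# Install or update the module from the PowerShell Gallery if needed
Install-Module -Name Az -Scope CurrentUser -Force

# Sign in to Azure operated by 21Vianet
Connect-AzAccount -Environment AzureChinaCloud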
Prerequisites
- Azure Storage account. You use Blob storage as both the source and sink data store. If you don't have an Azure storage account, see Create a storage account for steps to create one.
- Create a blob container in Blob storage, create an input folder in the container, and upload some files to the folder. You can use tools such as Azure Storage Explorer to connect to Azure Blob storage, create a blob container, upload an input file, and verify the output file. Alternatively, you can perform these steps with a few Az.Storage cmdlets, as shown in the sketch after this list.
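If you prefer to stay in PowerShell, the following is a minimal sketch of that setup using the Az.Storage cmdlets. The container name adftutorial and the local file inputfile.txt are placeholder values; substitute your own.

# Create a storage context from the account name and key
$context = New-AzStorageContext -StorageAccountName "<Azure storage account name>" -StorageAccountKey "<Azure storage account key>"

# Create the blob container
New-AzStorageContainer -Name "adftutorial" -Context $context

# Upload a local file to the input folder of the container
Set-AzStorageBlobContent -File "C:\inputfile.txt" -Container "adftutorial" -Blob "input/inputfile.txt" -Context $context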
Sample script
Important
This script creates JSON files that define Data Factory entities (linked service, dataset, and pipeline) on your hard drive, in the folder from which you run the script.
# Set variables with your own values
$resourceGroupName = "<Azure resource group name>"
$dataFactoryName = "<Data factory name>" # must be globally unique
$dataFactoryRegion = "China East 2"
$storageAccountName = "<Azure storage account name>"
$storageAccountKey = "<Azure storage account key>"
$sourceBlobPath = "<Azure blob container name>/<Azure blob input folder name>" # example: adftutorial/input
$sinkBlobPath = "<Azure blob container name>/<Azure blob output folder name>" # example: adftutorial/output
$pipelineName = "CopyPipeline"
# Create a resource group
New-AzResourceGroup -Name $resourceGroupName -Location $dataFactoryRegion
# Create a data factory
$df = Set-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Location $dataFactoryRegion -Name $dataFactoryName
# Create an Azure Storage linked service in the data factory
## JSON definition of the linked service.
$storageLinkedServiceDefinition = @"
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": {
                "value": "DefaultEndpointsProtocol=https;AccountName=$storageAccountName;AccountKey=$storageAccountKey;EndpointSuffix=core.chinacloudapi.cn",
                "type": "SecureString"
            }
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2LinkedService command.
$storageLinkedServiceDefinition | Out-File ./StorageLinkedService.json
## Create a linked service in the data factory
Set-AzDataFactoryV2LinkedService -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "AzureStorageLinkedService" -File ./StorageLinkedService.json
# Create an Azure Blob dataset in the data factory
## JSON definition of the dataset
$datasetDefinition = @"
{
    "name": "BlobDataset",
    "properties": {
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": {
                "value": "@{dataset().path}",
                "type": "Expression"
            }
        },
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "path": {
                "type": "String"
            }
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Dataset command.
$datasetDefinition | Out-File ./BlobDataset.json
## Create a dataset in the data factory
Set-AzDataFactoryV2Dataset -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "BlobDataset" -File "./BlobDataset.json"
# Create a pipeline in the data factory
## JSON definition of the pipeline
$pipelineDefinition = @"
{
    "name": "$pipelineName",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToBlob",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.inputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.outputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BlobSource"
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                }
            }
        ],
        "parameters": {
            "inputPath": {
                "type": "String"
            },
            "outputPath": {
                "type": "String"
            }
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Pipeline command.
$pipelineDefinition | Out-File ./CopyPipeline.json
## Create a pipeline in the data factory
Set-AzDataFactoryV2Pipeline -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name $pipelineName -File "./CopyPipeline.json"
# Create a pipeline run
## JSON definition for pipeline parameters
$pipelineParameters = @"
{
    "inputPath": "$sourceBlobPath",
    "outputPath": "$sinkBlobPath"
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Invoke-AzDataFactoryV2Pipeline command.
$pipelineParameters | Out-File ./PipelineParameters.json
# Create a pipeline run by using parameters
$runId = Invoke-AzDataFactoryV2Pipeline -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -PipelineName $pipelineName -ParameterFile ./PipelineParameters.json
# Check the pipeline run status until it finishes the copy operation
while ($True) {
    $result = Get-AzDataFactoryV2ActivityRun -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -PipelineRunId $runId -RunStartedAfter (Get-Date).AddMinutes(-30) -RunStartedBefore (Get-Date).AddMinutes(30)
    if (($result | Where-Object { $_.Status -eq "InProgress" } | Measure-Object).Count -ne 0) {
        Write-Host "Pipeline run status: In Progress" -ForegroundColor "Yellow"
        Start-Sleep -Seconds 30
    }
    else {
        Write-Host "Pipeline '$pipelineName' run finished. Result:" -ForegroundColor "Yellow"
        $result
        break
    }
}
# Get the activity run details
$result = Get-AzDataFactoryV2ActivityRun -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName `
    -PipelineRunId $runId `
    -RunStartedAfter (Get-Date).AddMinutes(-10) `
    -RunStartedBefore (Get-Date).AddMinutes(10) `
    -ErrorAction Stop
$result

if ($result.Status -eq "Succeeded") {
    $result.Output -join "`r`n"
}
else {
    $result.Error -join "`r`n"
}
# To remove the data factory from the resource group
# Remove-AzDataFactoryV2 -Name $dataFactoryName -ResourceGroupName $resourceGroupName
#
# To remove the whole resource group
# Remove-AzResourceGroup -Name $resourceGroupName
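After the run succeeds, you can also verify the copied files from PowerShell instead of Azure Storage Explorer. This is a minimal sketch that assumes the adftutorial container and output folder names from the example values above:

# List the blobs that the copy activity wrote to the output folder
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Get-AzStorageBlob -Container "adftutorial" -Prefix "output/" -Context $context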
Clean up deployment
After you run the sample script, you can use the following command to remove the resource group and all resources associated with it:
Remove-AzResourceGroup -ResourceGroupName $resourceGroupName
To remove the data factory from the resource group, run the following command:
Remove-AzDataFactoryV2 -Name $dataFactoryName -ResourceGroupName $resourceGroupName
Script explanation
This script uses the following commands:
Command | Notes |
---|---|
New-AzResourceGroup | Creates a resource group in which all resources are stored. |
Set-AzDataFactoryV2 | Creates a data factory. |
Set-AzDataFactoryV2LinkedService | Creates a linked service in the data factory. A linked service links a data store or compute to a data factory. |
Set-AzDataFactoryV2Dataset | Creates a dataset in the data factory. A dataset represents input/output for an activity in a pipeline. |
Set-AzDataFactoryV2Pipeline | Creates a pipeline in the data factory. A pipeline contains one or more activities that perform a certain operation. In this pipeline, a copy activity copies data from one location to another location in Azure Blob storage. |
Invoke-AzDataFactoryV2Pipeline | Creates a run for the pipeline. In other words, runs the pipeline. |
Get-AzDataFactoryV2ActivityRun | Gets details about the run of the activity (activity run) in the pipeline. |
Remove-AzResourceGroup | Deletes a resource group including all nested resources. |
Related content
For more information on Azure PowerShell, see the Azure PowerShell documentation.
For more Azure Data Factory PowerShell script samples, see Azure Data Factory PowerShell samples.