PowerShell script - Incrementally load data by using Azure Data Factory
After the initial full copy of data from the source to the sink, this sample PowerShell script loads only new or updated records from the source data store to the sink data store.
Note
We recommend that you use the Azure Az PowerShell module to interact with Azure. To get started, see Install Azure PowerShell. To learn how to migrate to the Az PowerShell module, see Migrate Azure PowerShell from AzureRM to Az.
This sample requires Azure PowerShell. Run Get-Module -ListAvailable Az to find the installed versions. If you need to install or upgrade, see Install the Azure PowerShell module.
Run the Connect-AzAccount -Environment AzureChinaCloud cmdlet to connect to Azure operated by 21Vianet.
For the prerequisites for running this sample, see Tutorial: Incremental copy.
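The script assumes the prerequisite objects from that tutorial already exist in the Azure SQL database: a source table named datasource with a LastModifytime column, a watermarktable that stores the last high-watermark value, and a sp_write_watermark stored procedure that advances it. As a rough T-SQL sketch of those objects (the authoritative definitions are in the tutorial; the PersonID and Name columns below are illustrative assumptions):

```sql
-- Hypothetical sketch of the prerequisite objects; see the tutorial for the authoritative definitions.

-- Source table the pipeline copies from (name matches $sourceSqlTableName).
CREATE TABLE dbo.datasource
(
    PersonID int,
    Name varchar(255),
    LastModifytime datetime  -- column the watermark comparison uses
);

-- Table that stores the high-watermark value per source table.
CREATE TABLE dbo.watermarktable
(
    TableName varchar(255),
    WatermarkValue datetime
);
INSERT INTO dbo.watermarktable
VALUES ('datasource', '1/1/2010 12:00:00 AM');

-- Stored procedure the pipeline calls to advance the watermark after each copy.
CREATE PROCEDURE dbo.sp_write_watermark @LastModifiedtime datetime, @TableName varchar(50)
AS
BEGIN
    UPDATE dbo.watermarktable
    SET WatermarkValue = @LastModifiedtime
    WHERE TableName = @TableName;
END
```

Seeding watermarktable with a value earlier than any LastModifytime in the source makes the first pipeline run copy all rows; subsequent runs copy only rows modified after the stored watermark.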
Sample script
Important
This script creates JSON files that define the data factory entities (linked services, datasets, and a pipeline) in the folder from which it runs.
# Set variables with your own values
$resourceGroupName = "<azure resource group name>" # group will be created if it does not exist already
$dataFactoryName = "<data factory name>" # must be globally unique
$dataFactoryRegion = "China East 2"
$storageAccountName = "<Azure storage account name>"
$storageAccountKey = "<Azure storage account key>"
$azureSqlServer = "<azure sql server name>"
$azureSqlDatabase = "<azure sql database name>"
$azureSqlUser = "<azure sql server - user name>"
$azureSqlPassword = "<azure sql server - user password>"
$outputBlobFolderPath = "<azure blob container>/<folder>" # output folder where the records are copied.
$azureStorageLinkedServiceName = "AzureStorageLinkedService"
$azureSqlDatabaseLinkedServiceName = "AzureSQLDatabaseLinkedService"
$sourceDatasetName = "SourceDataset"
$sinkDatasetName = "SinkDataset"
$sourceSqlTableName = "datasource"
$watermarkDatasetName = "WatermarkDataset"
$pipelineName = "IncrementalCopyPipeline"
# Create a resource group
New-AzResourceGroup -Name $resourceGroupName -Location $dataFactoryRegion
# Create a data factory
$df = Set-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Location $dataFactoryRegion -Name $dataFactoryName
# Create an Azure Storage linked service in the data factory
## JSON definition of the linked service.
$storageLinkedServiceDefinition = @"
{
    "name": "$azureStorageLinkedServiceName",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": {
                "value": "DefaultEndpointsProtocol=https;AccountName=$storageAccountName;AccountKey=$storageAccountKey;EndpointSuffix=core.chinacloudapi.cn",
                "type": "SecureString"
            }
        }
    }
}
"@
## IMPORTANT: stores the JSON definition in a file that will be used by the Set-AzDataFactoryV2LinkedService command.
$storageLinkedServiceDefinition | Out-File ./$azureStorageLinkedServiceName.json
## Creates a linked service in the data factory
Set-AzDataFactoryV2LinkedService -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$azureStorageLinkedServiceName" -File ./$azureStorageLinkedServiceName.json
## JSON definition of the linked service.
$azureSQLDatabaseLinkedServiceDefinition = @"
{
    "name": "$azureSqlDatabaseLinkedServiceName",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "value": "Server = tcp:$azureSqlServer.database.chinacloudapi.cn,1433;Initial Catalog=$azureSqlDatabase; Persist Security Info=False; User ID=$azureSqlUser; Password=$azureSqlPassword; MultipleActiveResultSets = False; Encrypt = True; TrustServerCertificate = False; Connection Timeout = 30;",
                "type": "SecureString"
            }
        }
    }
}
"@
## IMPORTANT: stores the JSON definition in a file that will be used by the Set-AzDataFactoryV2LinkedService command.
$azureSQLDatabaseLinkedServiceDefinition | Out-File ./$azureSqlDatabaseLinkedServiceName.json
## Creates a linked service in the data factory
Set-AzDataFactoryV2LinkedService -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$azureSqlDatabaseLinkedServiceName" -File ./$azureSqlDatabaseLinkedServiceName.json
# Create an Azure SQL dataset in the data factory
## JSON definition of the dataset
$sourceDataset = @"
{
    "name": "$sourceDatasetName",
    "properties": {
        "type": "AzureSqlTable",
        "typeProperties": {
            "tableName": "$sourceSqlTableName"
        },
        "linkedServiceName": {
            "referenceName": "$azureSqlDatabaseLinkedServiceName",
            "type": "LinkedServiceReference"
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Dataset command.
$sourceDataset | Out-File ./$sourceDatasetName.json
## Create a dataset in the data factory
Set-AzDataFactoryV2Dataset -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$sourceDatasetName" -File "./$sourceDatasetName.json"
# Create an Azure Blob dataset in the data factory
## JSON definition of the dataset
$sinkDataset = @"
{
    "name": "$sinkDatasetName",
    "properties": {
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": "$outputBlobFolderPath",
            "format": {
                "type": "TextFormat"
            }
        },
        "linkedServiceName": {
            "referenceName": "$azureStorageLinkedServiceName",
            "type": "LinkedServiceReference"
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Dataset command.
$sinkDataset | Out-File ./$sinkDatasetName.json
## Create a dataset in the data factory
Set-AzDataFactoryV2Dataset -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$sinkDatasetName" -File "./$sinkDatasetName.json"
## JSON definition of the dataset
$watermarkDataset = @"
{
    "name": "$watermarkDatasetName",
    "properties": {
        "type": "AzureSqlTable",
        "typeProperties": {
            "tableName": "watermarktable"
        },
        "linkedServiceName": {
            "referenceName": "$azureSqlDatabaseLinkedServiceName",
            "type": "LinkedServiceReference"
        }
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Dataset command.
$watermarkDataset | Out-File ./$watermarkDatasetName.json
## Create a dataset in the data factory
Set-AzDataFactoryV2Dataset -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$watermarkDatasetName" -File "./$watermarkDatasetName.json"
# Create a pipeline in the data factory
## JSON definition of the pipeline
$pipelineDefinition = @"
{
    "name": "$pipelineName",
    "properties": {
        "activities": [
            {
                "name": "LookupWaterMarkActivity",
                "type": "Lookup",
                "typeProperties": {
                    "source": {
                        "type": "SqlSource",
                        "sqlReaderQuery": "select * from watermarktable"
                    },
                    "dataset": {
                        "referenceName": "$watermarkDatasetName",
                        "type": "DatasetReference"
                    }
                }
            },
            {
                "name": "LookupMaxValuefromSourceActivity",
                "type": "Lookup",
                "typeProperties": {
                    "source": {
                        "type": "SqlSource",
                        "sqlReaderQuery": "SELECT MAX(LastModifytime) as NewWatermarkvalue FROM dbo.datasource"
                    },
                    "dataset": {
                        "referenceName": "$sourceDatasetName",
                        "type": "DatasetReference"
                    }
                }
            },
            {
                "name": "IncrementalCopyActivity",
                "type": "Copy",
                "typeProperties": {
                    "source": {
                        "type": "SqlSource",
                        "sqlReaderQuery": "select * from dbo.datasource where LastModifytime > '@{activity('LookupWaterMarkActivity').output.WatermarkValue}' and LastModifytime <= '@{activity('LookupMaxValuefromSourceActivity').output.NewWatermarkvalue}'"
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                },
                "dependsOn": [
                    {
                        "activity": "LookupMaxValuefromSourceActivity",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    },
                    {
                        "activity": "LookupWaterMarkActivity",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ],
                "inputs": [
                    {
                        "referenceName": "$sourceDatasetName",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "$sinkDatasetName",
                        "type": "DatasetReference"
                    }
                ]
            },
            {
                "name": "StoredProceduretoWriteWatermarkActivity",
                "type": "SqlServerStoredProcedure",
                "typeProperties": {
                    "storedProcedureName": "sp_write_watermark",
                    "storedProcedureParameters": {
                        "LastModifiedtime": { "value": "@{activity('LookupMaxValuefromSourceActivity').output.NewWatermarkvalue}", "type": "datetime" },
                        "TableName": { "value": "@{activity('LookupWaterMarkActivity').output.TableName}", "type": "String" }
                    }
                },
                "linkedServiceName": {
                    "referenceName": "$azureSqlDatabaseLinkedServiceName",
                    "type": "LinkedServiceReference"
                },
                "dependsOn": [
                    {
                        "activity": "IncrementalCopyActivity",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ]
            }
        ]
    }
}
"@
## IMPORTANT: store the JSON definition in a file that will be used by the Set-AzDataFactoryV2Pipeline command.
$pipelineDefinition | Out-File ./$pipelineName.json
## Create a pipeline in the data factory
Set-AzDataFactoryV2Pipeline -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -Name "$pipelineName" -File "./$pipelineName.json"
$runId = Invoke-AzDataFactoryV2Pipeline -PipelineName "$pipelineName" -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName
# Check the pipeline run status until it finishes the copy operation
Start-Sleep -Seconds 30
while ($True) {
    $result = Get-AzDataFactoryV2ActivityRun -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName -PipelineRunId $runId -RunStartedAfter (Get-Date).AddMinutes(-30) -RunStartedBefore (Get-Date).AddMinutes(30)
    if (($result | Where-Object { $_.Status -eq "InProgress" } | Measure-Object).count -ne 0) {
        Write-Host "Pipeline run status: In Progress" -foregroundcolor "Yellow"
        Start-Sleep -Seconds 30
    }
    else {
        Write-Host "Pipeline '$pipelineName' run finished. Result:" -foregroundcolor "Yellow"
        $result
        break
    }
}
$result = Get-AzDataFactoryV2ActivityRun -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName `
-PipelineRunId $runId `
-RunStartedAfter (Get-Date).AddMinutes(-10) `
-RunStartedBefore (Get-Date).AddMinutes(10) `
-ErrorAction Stop
$result
if ($result.Status -eq "Succeeded") {
    $result.Output -join "`r`n"
}
else {
    $result.Error -join "`r`n"
}
# INSERT INTO datasource
# VALUES (6, 'newdata','9/6/2017 2:23:00 AM')
# INSERT INTO datasource
# VALUES (7, 'newdata','9/7/2017 9:01:00 AM')
# Invoke the pipeline and see that only new records are copied to the destination.
# To remove the data factory from the resource group
# Remove-AzDataFactoryV2 -Name $dataFactoryName -ResourceGroupName $resourceGroupName
#
# To remove the whole resource group
# Remove-AzResourceGroup -Name $resourceGroupName
Clean up deployment
After you run the sample script, you can use the following command to remove the resource group and all resources associated with it:
Remove-AzResourceGroup -Name $resourceGroupName
To remove the data factory from the resource group, run the following command:
Remove-AzDataFactoryV2 -Name $dataFactoryName -ResourceGroupName $resourceGroupName
Script explanation
This script uses the following commands:
Command | Notes |
---|---|
New-AzResourceGroup | Creates a resource group in which all resources are stored. |
Set-AzDataFactoryV2 | Creates a data factory. |
Set-AzDataFactoryV2LinkedService | Creates a linked service in the data factory. A linked service links a data store or compute to a data factory. |
Set-AzDataFactoryV2Dataset | Creates a dataset in the data factory. A dataset represents the input/output of an activity in a pipeline. |
Set-AzDataFactoryV2Pipeline | Creates a pipeline in the data factory. A pipeline contains one or more activities that perform a certain operation. In this pipeline, the copy activity copies the new or changed rows from the Azure SQL database to Azure Blob storage. |
Invoke-AzDataFactoryV2Pipeline | Creates a run for the pipeline. In other words, it runs the pipeline. |
Get-AzDataFactoryV2ActivityRun | Gets details about runs of activities (activity runs) in the pipeline. |
Remove-AzResourceGroup | Deletes a resource group, including all nested resources. |
Related content
For more information about Azure PowerShell, see the Azure PowerShell documentation.
You can find additional Azure Data Factory PowerShell script samples in Azure Data Factory PowerShell scripts.