Copy data from and to the SFTP server by using Azure Data Factory

Applies to: Azure Data Factory, Azure Synapse Analytics (preview)

This article outlines how to copy data from and to the secure FTP (SFTP) server. To learn about Azure Data Factory, read the introductory article.

Supported capabilities

The SFTP connector is supported for the following activities:

- Copy activity
- Lookup activity
- GetMetadata activity
- Delete activity

Specifically, the SFTP connector supports:

- Copying files from and to the SFTP server by using Basic or SSH public key authentication.
- Copying files as is, or parsing and generating files with the supported file formats and compression codecs.

Prerequisites

If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to set up a self-hosted integration runtime to connect to it.

If your data store is a managed cloud data service, you can use the Azure integration runtime. If access is restricted to IPs that are allow-listed in the firewall rules, you can add the Azure Integration Runtime IPs to the allow list.

For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.

Get started

To perform the Copy activity with a pipeline, you can use one of the following tools or SDKs:

The following sections provide details about properties that are used to define Data Factory entities specific to SFTP.

Linked service properties

The following properties are supported for the SFTP linked service:

Property | Description | Required
type | The type property must be set to Sftp. | Yes
host | The name or IP address of the SFTP server. | Yes
port | The port on which the SFTP server is listening. The allowed value is an integer, and the default value is 22. | No
skipHostKeyValidation | Specify whether to skip host key validation. Allowed values are true and false (default). | No
hostKeyFingerprint | Specify the fingerprint of the host key. One way to retrieve it is shown in the sketch after this table. | Yes, if skipHostKeyValidation is set to false.
authenticationType | Specify the authentication type. Allowed values are Basic and SshPublicKey. For more properties, see the Use basic authentication section. For JSON examples, see the Use SSH public key authentication section. | Yes
connectVia | The integration runtime to be used to connect to the data store. To learn more, see the Prerequisites section. If the integration runtime isn't specified, the service uses the default Azure Integration Runtime. | No
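
If your server's host key fingerprint isn't documented anywhere, the following minimal sketch (not part of the official docs) shows one way to retrieve it with Python and the paramiko library. The server name is hypothetical, and the MD5 colon-hex format printed here is an assumption modeled on the hostKeyFingerprint example below; confirm the exact format your data factory expects.

import hashlib
import paramiko

host, port = "sftp.example.com", 22              # hypothetical server

transport = paramiko.Transport((host, port))
transport.start_client()                         # SSH handshake only; no sign-in needed
key = transport.get_remote_server_key()          # the server's public host key
transport.close()

md5 = hashlib.md5(key.asbytes()).hexdigest()
fingerprint = ":".join(md5[i:i + 2] for i in range(0, len(md5), 2))
print(key.get_name(), key.get_bits(), fingerprint)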

Use basic authentication

To use basic authentication, set the authenticationType property to Basic, and specify the following properties in addition to the SFTP connector generic properties that were introduced in the preceding section:

Property | Description | Required
userName | The user who has access to the SFTP server. | Yes
password | The password for the user (userName). Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in Azure Key Vault. | Yes

Example:

{
    "name": "SftpLinkedService",
    "type": "linkedservices",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server>",
            "port": 22,
            "skipHostKeyValidation": false,
            "hostKeyFingerPrint": "ssh-rsa 2048 xx:00:00:00:xx:00:x0:0x:0x:0x:0x:00:00:x0:x0:00",
            "authenticationType": "Basic",
            "userName": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}

Use SSH public key authentication

To use SSH public key authentication, set the authenticationType property to SshPublicKey, and specify the following properties in addition to the SFTP connector generic properties introduced in the preceding section:

Property | Description | Required
userName | The user who has access to the SFTP server. | Yes
privateKeyPath | Specify the absolute path to the private key file that the integration runtime can access. This applies only when the self-hosted type of integration runtime is specified in connectVia. | Specify either privateKeyPath or privateKeyContent.
privateKeyContent | Base64-encoded SSH private key content. The SSH private key should be in OpenSSH format. Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in Azure Key Vault. | Specify either privateKeyPath or privateKeyContent.
passPhrase | Specify the pass phrase or password to decrypt the private key if the key file or the key content is protected by a pass phrase. Mark this field as a SecureString to store it securely in your data factory, or reference a secret stored in Azure Key Vault. | Yes, if the private key file or the key content is protected by a pass phrase.

Note

The SFTP connector supports an RSA/DSA OpenSSH key. Make sure that your key file content starts with "-----BEGIN [RSA/DSA] PRIVATE KEY-----". If the private key file is a PPK-format file, use the PuTTY tool to convert it from PPK to OpenSSH format.
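
As an illustration only (the key file path below is hypothetical), this sketch base64-encodes an OpenSSH-format private key file so the result can be supplied as the privateKeyContent value shown in Example 2:

import base64

with open("/path/to/id_rsa_openssh", "rb") as key_file:   # hypothetical OpenSSH key file
    key_bytes = key_file.read()

# The decoded content should start with "-----BEGIN RSA PRIVATE KEY-----" (or DSA).
print(base64.b64encode(key_bytes).decode("ascii"))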

Example 1: SshPublicKey authentication using private key filePath

{
    "name": "SftpLinkedService",
    "type": "Linkedservices",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server>",
            "port": 22,
            "skipHostKeyValidation": true,
            "authenticationType": "SshPublicKey",
            "userName": "xxx",
            "privateKeyPath": "D:\\privatekey_openssh",
            "passPhrase": {
                "type": "SecureString",
                "value": "<pass phrase>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}

Example 2: SshPublicKey authentication using private key content

{
    "name": "SftpLinkedService",
    "type": "Linkedservices",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "<sftp server>",
            "port": 22,
            "skipHostKeyValidation": true,
            "authenticationType": "SshPublicKey",
            "userName": "<username>",
            "privateKeyContent": {
                "type": "SecureString",
                "value": "<base64 string of the private key content>"
            },
            "passPhrase": {
                "type": "SecureString",
                "value": "<pass phrase>"
            }
        },
        "connectVia": {
            "referenceName": "<name of integration runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}

Dataset properties

For a full list of sections and properties that are available for defining datasets, see the Datasets article.

Azure Data Factory supports the following file formats. Refer to each article for format-based settings.

The following properties are supported for SFTP under location settings in the format-based dataset:

Property | Description | Required
type | The type property under location in the dataset must be set to SftpLocation. | Yes
folderPath | The path to the folder. If you want to use a wildcard to filter the folder, skip this setting and specify the path in the activity source settings. | No
fileName | The file name under the specified folderPath. If you want to use a wildcard to filter files, skip this setting and specify the file name in the activity source settings. | No

Example:

{
    "name": "DelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "<SFTP linked service name>",
            "type": "LinkedServiceReference"
        },
        "schema": [ < physical schema, optional, auto retrieved during authoring > ],
        "typeProperties": {
            "location": {
                "type": "SftpLocation",
                "folderPath": "root/folder/subfolder"
            },
            "columnDelimiter": ",",
            "quoteChar": "\"",
            "firstRowAsHeader": true,
            "compressionCodec": "gzip"
        }
    }
}

Copy activity properties

For a full list of sections and properties that are available for defining activities, see the Pipelines article. This section provides a list of properties that are supported by the SFTP source.

SFTP as source

Azure Data Factory supports the following file formats. Refer to each article for format-based settings.

The following properties are supported for SFTP under the storeSettings settings in the format-based Copy source:

Property | Description | Required
type | The type property under storeSettings must be set to SftpReadSettings. | Yes

Locate the files to copy:

OPTION 1: static path | Copy from the folder/file path that's specified in the dataset. If you want to copy all files from a folder, additionally specify wildcardFileName as *. |
OPTION 2: wildcard - wildcardFolderPath | The folder path with wildcard characters to filter source folders. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape character inside. For more examples, see Folder and file filter examples. | No
OPTION 2: wildcard - wildcardFileName | The file name with wildcard characters under the specified folderPath/wildcardFolderPath to filter source files. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual file name has a wildcard or this escape character inside. For more examples, see Folder and file filter examples. | Yes
OPTION 3: a list of files - fileListPath | Indicates to copy a specified file set. Point to a text file that includes a list of files you want to copy (one file per line, with the relative path to the path configured in the dataset). When you use this option, don't specify the file name in the dataset. For more examples, see File list examples. | No

Additional settings:

recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. When recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink. Allowed values are true (default) and false. This property doesn't apply when you configure fileListPath. | No
deleteFilesAfterCompletion | Indicates whether the binary files will be deleted from the source store after successfully moving to the destination store. The file deletion is per file, so when the Copy activity fails, you will see that some files have already been copied to the destination and deleted from the source, while others still remain in the source store. This property is valid only in the binary files copy scenario. The default value is false. | No
modifiedDatetimeStart | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format 2018-12-01T05:00:00Z. The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. This property doesn't apply when you configure fileListPath. A sketch of this time window follows the example below. | No
modifiedDatetimeEnd | Same as above. | No
enablePartitionDiscovery | For files that are partitioned, specify whether to parse the partitions from the file path and add them as additional source columns. Allowed values are false (default) and true. | No
partitionRootPath | When partition discovery is enabled, specify the absolute root path in order to read partitioned folders as data columns. If it is not specified, by default: when you use the file path in the dataset or the list of files on the source, the partition root path is the path configured in the dataset; when you use the wildcard folder filter, the partition root path is the sub-path before the first wildcard. For example, assuming you configure the path in the dataset as "root/folder/year=2020/month=08/day=27": if you specify the partition root path as "root/folder/year=2020", the Copy activity generates two more columns, month and day, with the values "08" and "27" respectively, in addition to the columns inside the files; if the partition root path is not specified, no extra column is generated. A sketch of this mapping follows this table. | No
maxConcurrentConnections | The number of connections that can connect to the data store concurrently. Specify a value only when you want to limit concurrent connections to the data store. | No
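
The following minimal sketch (an illustration only, not Azure Data Factory's implementation) shows how the partition-discovery example above maps folder segments such as year=2020/month=08/day=27 to extra columns relative to a partition root path:

def partition_columns(file_path: str, partition_root: str) -> dict:
    # Keep only the part of the path below the partition root, then collect key=value segments.
    relative = file_path[len(partition_root):].strip("/")
    columns = {}
    for segment in relative.split("/"):
        if "=" in segment:
            name, value = segment.split("=", 1)
            columns[name] = value
    return columns

# With the documented example, a root of "root/folder/year=2020" yields {'month': '08', 'day': '27'}.
print(partition_columns("root/folder/year=2020/month=08/day=27/data.csv", "root/folder/year=2020"))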

Example:

"activities":[
    {
        "name": "CopyFromSFTP",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<Delimited text input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "DelimitedTextSource",
                "formatSettings":{
                    "type": "DelimitedTextReadSettings",
                    "skipLineCount": 10
                },
                "storeSettings":{
                    "type": "SftpReadSettings",
                    "recursive": true,
                    "wildcardFolderPath": "myfolder*A",
                    "wildcardFileName": "*.csv"
                }
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
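
The modifiedDatetimeStart/modifiedDatetimeEnd rows above describe an inclusive start and an exclusive end, with either bound allowed to be NULL. Here is a minimal sketch of that window check (an illustration only, not the service's implementation):

from datetime import datetime, timezone
from typing import Optional

def in_modified_window(last_modified: datetime, start: Optional[datetime], end: Optional[datetime]) -> bool:
    # start is inclusive (>=), end is exclusive (<); a NULL bound means no filter on that side.
    if start is not None and last_modified < start:
        return False
    if end is not None and last_modified >= end:
        return False
    return True

start = datetime(2018, 12, 1, 5, 0, 0, tzinfo=timezone.utc)   # "2018-12-01T05:00:00Z"
print(in_modified_window(datetime(2018, 12, 1, 5, 30, tzinfo=timezone.utc), start, None))  # True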

SFTP as a sink

Azure Data Factory supports the following file formats. Refer to each article for format-based settings.

The following properties are supported for SFTP under storeSettings settings in a format-based Copy sink:

Property | Description | Required
type | The type property under storeSettings must be set to SftpWriteSettings. | Yes
copyBehavior | Defines the copy behavior when the source is files from a file-based data store. Allowed values are: PreserveHierarchy (default), which preserves the file hierarchy in the target folder (the relative path of the source file to the source folder is identical to the relative path of the target file to the target folder); FlattenHierarchy, which places all files from the source folder in the first level of the target folder with autogenerated names; and MergeFiles, which merges all files from the source folder into one file (if the file name is specified, the merged file name is the specified name; otherwise, it's an autogenerated file name). A sketch of these behaviors follows the example below. | No
maxConcurrentConnections | The number of connections that can connect to the data store concurrently. Specify a value only when you want to limit concurrent connections to the data store. | No
useTempFileRename | Indicates whether to upload to temporary files and rename them, or to write directly to the target folder or file location. By default, Azure Data Factory first writes to temporary files and then renames them when the upload is finished. This sequence helps to (1) avoid conflicts that might result in a corrupted file if other processes are writing to the same file, and (2) ensure that the original version of the file exists during the transfer. If your SFTP server doesn't support a rename operation, disable this option and make sure that you don't have a concurrent write to the target file. For more information, see the troubleshooting tip at the end of this table. | No. The default value is true.
operationTimeout | The wait time before each write request to the SFTP server times out. The default value is 60 minutes (01:00:00). | No

Tip

If you receive the error "UserErrorSftpPathNotFound," "UserErrorSftpPermissionDenied," or "SftpOperationFail" when you're writing data into SFTP, and the SFTP user you use does have the proper permissions, check whether your SFTP server supports the file rename operation. If it doesn't, disable the Upload with temp file (useTempFileRename) option and try again. To learn more about this property, see the preceding table; a sketch of the temp-file-then-rename pattern follows this tip. If you use a self-hosted integration runtime for the Copy activity, be sure to use version 4.6 or later.
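
For reference, this is a minimal sketch of the upload-to-temporary-file-then-rename pattern that useTempFileRename describes, written with the paramiko library. The server name, credentials, and paths are hypothetical, and this is not how Azure Data Factory itself is implemented:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())   # roughly analogous to skipHostKeyValidation
client.connect("sftp.example.com", port=22, username="user", password="secret")

sftp = client.open_sftp()
temp_path, final_path = "/upload/data.csv.tmp", "/upload/data.csv"
sftp.put("data.csv", temp_path)        # write to a temporary file first
sftp.rename(temp_path, final_path)     # switch to the final name; fails if the server forbids rename
sftp.close()
client.close()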

Example:

"activities":[
    {
        "name": "CopyToSFTP",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "<source type>"
            },
            "sink": {
                "type": "BinarySink",
                "storeSettings":{
                    "type": "SftpWriteSettings",
                    "copyBehavior": "PreserveHierarchy"
                }
            }
        }
    }
]
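
As a rough illustration of the three copyBehavior values described above (not the service's implementation; the folder names are hypothetical), the following sketch shows how a source file's relative path could map to a sink path under each setting:

import posixpath
import uuid

def sink_path(relative_source_path: str, sink_folder: str, behavior: str) -> str:
    if behavior == "PreserveHierarchy":
        # Keep the source-relative path under the sink folder.
        return posixpath.join(sink_folder, relative_source_path)
    if behavior == "FlattenHierarchy":
        # All files land in the first level of the sink folder with autogenerated names.
        return posixpath.join(sink_folder, uuid.uuid4().hex)
    if behavior == "MergeFiles":
        # Every source file is appended into a single target file.
        return posixpath.join(sink_folder, "merged")
    raise ValueError(f"Unknown copyBehavior: {behavior}")

print(sink_path("FolderA/Subfolder1/File3.csv", "/sink", "PreserveHierarchy"))
# /sink/FolderA/Subfolder1/File3.csv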

Folder and file filter examples

This section describes the behavior that results from using wildcard filters with folder paths and file names; a small sketch of the matching rules follows the table.

folderPath | fileName | recursive | Source folder structure and filter result (retrieved files are marked "(retrieved)")

Folder* | (empty, use default) | false
FolderA
    File1.csv (retrieved)
    File2.json (retrieved)
    Subfolder1
        File3.csv
        File4.json
        File5.csv
AnotherFolderB
    File6.csv

Folder* | (empty, use default) | true
FolderA
    File1.csv (retrieved)
    File2.json (retrieved)
    Subfolder1
        File3.csv (retrieved)
        File4.json (retrieved)
        File5.csv (retrieved)
AnotherFolderB
    File6.csv

Folder* | *.csv | false
FolderA
    File1.csv (retrieved)
    File2.json
    Subfolder1
        File3.csv
        File4.json
        File5.csv
AnotherFolderB
    File6.csv

Folder* | *.csv | true
FolderA
    File1.csv (retrieved)
    File2.json
    Subfolder1
        File3.csv (retrieved)
        File4.json
        File5.csv (retrieved)
AnotherFolderB
    File6.csv
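
As an approximation of the matching behavior shown above (an illustration only: Python's fnmatch treats ? as exactly one character, whereas the connector documents it as zero or a single character, and the ^ escape is not modeled), here is a small sketch for the "Folder*", "*.csv", non-recursive case:

from fnmatch import fnmatch

files = [
    "FolderA/File1.csv", "FolderA/File2.json",
    "FolderA/Subfolder1/File3.csv", "FolderA/Subfolder1/File4.json",
    "FolderA/Subfolder1/File5.csv", "AnotherFolderB/File6.csv",
]

def matches(path: str, folder_pattern: str, file_pattern: str, recursive: bool) -> bool:
    folder, name = path.rsplit("/", 1)
    top_level = folder.split("/")[0]
    if not fnmatch(top_level, folder_pattern):   # the folder filter applies to the top-level folder
        return False
    if not recursive and folder != top_level:    # non-recursive: skip files in subfolders
        return False
    return fnmatch(name, file_pattern)

print([f for f in files if matches(f, "Folder*", "*.csv", recursive=False)])
# ['FolderA/File1.csv']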

File list examples

This table describes the behavior that results from using a file list path in the Copy activity source. It assumes that you have the following source folder structure and want to copy the files marked "(to copy)":

Sample source structure:
root
    FolderA
        File1.csv (to copy)
        File2.json
        Subfolder1
            File3.csv (to copy)
            File4.json
            File5.csv (to copy)
    Metadata
        FileListToCopy.txt

Content in FileListToCopy.txt:
File1.csv
Subfolder1/File3.csv
Subfolder1/File5.csv

Azure Data Factory configuration:
In the dataset:
- Folder path: root/FolderA
In the Copy activity source:
- File list path: root/Metadata/FileListToCopy.txt

The file list path points to a text file in the same data store that includes a list of files you want to copy (one file per line, with the relative path to the path configured in the dataset).
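
For reference, a minimal sketch (an illustration only) of how each line in FileListToCopy.txt resolves against the dataset folder path configured above:

import posixpath

dataset_folder = "root/FolderA"
file_list = ["File1.csv", "Subfolder1/File3.csv", "Subfolder1/File5.csv"]   # lines from FileListToCopy.txt

for relative_path in file_list:
    print(posixpath.join(dataset_folder, relative_path))
# root/FolderA/File1.csv
# root/FolderA/Subfolder1/File3.csv
# root/FolderA/Subfolder1/File5.csv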

Lookup activity properties

For information about Lookup activity properties, see Lookup activity in Azure Data Factory.

GetMetadata activity properties

For information about GetMetadata activity properties, see GetMetadata activity in Azure Data Factory.

Delete activity properties

For information about Delete activity properties, see Delete activity in Azure Data Factory.

Legacy models

Note

The following models are still supported as is for backward compatibility. We recommend that you use the previously discussed new model, because the Azure Data Factory authoring UI has switched to generating the new model.

Legacy dataset model

Property | Description | Required
type | The type property of the dataset must be set to FileShare. | Yes
folderPath | The path to the folder. A wildcard filter is supported. Allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape character inside. Examples: rootfolder/subfolder/; see more examples in Folder and file filter examples. | Yes
fileName | Name or wildcard filter for the files under the specified folderPath. If you don't specify a value for this property, the dataset points to all files in the folder. For the filter, allowed wildcards are * (matches zero or more characters) and ? (matches zero or a single character). Example 1: "fileName": "*.csv". Example 2: "fileName": "???20180427.txt". Use ^ to escape if your actual file name has a wildcard or this escape character inside. | No
modifiedDatetimeStart | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format 2018-12-01T05:00:00Z. The overall performance of data movement will be affected by enabling this setting when you want to filter large numbers of files. The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. | No
modifiedDatetimeEnd | Files are filtered based on the attribute Last Modified. The files are selected if their last modified time is within the range of modifiedDatetimeStart to modifiedDatetimeEnd. The time is applied to the UTC time zone in the format 2018-12-01T05:00:00Z. The overall performance of data movement will be affected by enabling this setting when you want to filter large numbers of files. The properties can be NULL, which means that no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected. | No
format | If you want to copy files as is between file-based stores (binary copy), skip the format section in both input and output dataset definitions. If you want to parse files with a specific format, the following file format types are supported: TextFormat, JsonFormat, AvroFormat, OrcFormat, and ParquetFormat. Set the type property under format to one of these values. For more information, see the Text format, Json format, Avro format, Orc format, and Parquet format sections. | No (only for binary copy scenario)
compression | Specify the type and level of compression for the data. For more information, see Supported file formats and compression codecs. Supported types are GZip, Deflate, BZip2, and ZipDeflate. Supported levels are Optimal and Fastest. | No

Tip

- To copy all files under a folder, specify folderPath only.
- To copy a single file with a specified name, specify folderPath with the folder part and fileName with the file name.
- To copy a subset of files under a folder, specify folderPath with the folder part and fileName with the wildcard filter.

Note

If you were using the fileFilter property for the file filter, it is still supported as is, but we recommend that you use the new filter capability added to fileName from now on.

Example:

{
    "name": "SFTPDataset",
    "type": "Datasets",
    "properties": {
        "type": "FileShare",
        "linkedServiceName":{
            "referenceName": "<SFTP linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "folder/subfolder/",
            "fileName": "*",
            "modifiedDatetimeStart": "2018-12-01T05:00:00Z",
            "modifiedDatetimeEnd": "2018-12-01T06:00:00Z",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "\n"
            },
            "compression": {
                "type": "GZip",
                "level": "Optimal"
            }
        }
    }
}

Legacy Copy activity source model

Property | Description | Required
type | The type property of the Copy activity source must be set to FileSystemSource. | Yes
recursive | Indicates whether the data is read recursively from the subfolders or only from the specified folder. When recursive is set to true and the sink is a file-based store, empty folders and subfolders won't be copied or created at the sink. Allowed values are true (default) and false. | No
maxConcurrentConnections | The number of connections that can connect to a data store concurrently. Specify a number only when you want to limit the concurrent connections to the data store. | No

Example:

"activities":[
    {
        "name": "CopyFromSFTP",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<SFTP input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "FileSystemSource",
                "recursive": true
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]

Next steps

For a list of data stores that are supported as sources and sinks by the Copy activity in Azure Data Factory, see supported data stores.