# Copy and transform data from Hive using Azure Data Factory

Applies to: Azure Data Factory, Azure Synapse Analytics

This article outlines how to use the Copy activity in Azure Data Factory to copy data from Hive. It builds on the copy activity overview article, which presents a general overview of the copy activity.

## Supported capabilities

This Hive connector is supported for the following activities:

- Copy activity with supported source/sink matrix
- Lookup activity

You can copy data from Hive to any supported sink data store. For a list of data stores that the copy activity supports as sources and sinks, see the Supported data stores table.

Azure Data Factory provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.


If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.

Alternatively, if your data store is a managed cloud data service, you can use the Azure integration runtime. If access is restricted to IPs that are approved in the firewall rules, you can add the Azure Integration Runtime IPs to the allow list.

For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.

## Getting started

To perform the Copy activity with a pipeline, you can use one of the following tools or SDKs: the Copy Data tool, the Azure portal, the .NET SDK, the Python SDK, Azure PowerShell, the REST API, or an Azure Resource Manager template.

The following sections provide details about properties that are used to define Data Factory entities specific to the Hive connector.

## Linked service properties

The following properties are supported for the Hive linked service:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property must be set to: Hive | Yes |
| host | IP address or host name of the Hive server, separated by ';' for multiple hosts (only when serviceDiscoveryMode is enabled). | Yes |
| port | The TCP port that the Hive server uses to listen for client connections. If you connect to Azure HDInsight, specify port 443. | Yes |
| serverType | The type of Hive server. Allowed values are: HiveServer1, HiveServer2, HiveThriftServer | No |
| thriftTransportProtocol | The transport protocol to use in the Thrift layer. Allowed values are: Binary, SASL, HTTP | No |
| authenticationType | The authentication method used to access the Hive server. Allowed values are: Anonymous, Username, UsernameAndPassword, WindowsAzureHDInsightService. Kerberos authentication is not currently supported. | Yes |
| serviceDiscoveryMode | true to indicate using the ZooKeeper service, false to indicate not. | No |
| zooKeeperNameSpace | The namespace on ZooKeeper under which Hive Server 2 nodes are added. | No |
| useNativeQuery | Specifies whether the driver uses native HiveQL queries, or converts them into an equivalent form in HiveQL. | No |
| username | The user name that you use to access Hive Server. | No |
| password | The password corresponding to the user. Mark this field as a SecureString to store it securely in Data Factory, or reference a secret stored in Azure Key Vault. | No |
| httpPath | The partial URL corresponding to the Hive server. | No |
| enableSsl | Specifies whether connections to the server are encrypted using TLS. The default value is false. | No |
| trustedCertPath | The full path of the .pem file containing trusted CA certificates for verifying the server when connecting over TLS. This property can only be set when using TLS on a self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No |
| useSystemTrustStore | Specifies whether to use a CA certificate from the system trust store or from a specified PEM file. The default value is false. | No |
| allowHostNameCNMismatch | Specifies whether to require a CA-issued TLS/SSL certificate name to match the host name of the server when connecting over TLS. The default value is false. | No |
| allowSelfSignedServerCert | Specifies whether to allow self-signed certificates from the server. The default value is false. | No |
| connectVia | The integration runtime to be used to connect to the data store. Learn more from the Prerequisites section. If not specified, it uses the default Azure Integration Runtime. | No |


Example:

```json
{
    "name": "HiveLinkedService",
    "properties": {
        "type": "Hive",
        "typeProperties": {
            "host": "<cluster>.azurehdinsight.cn",
            "port": "<port>",
            "authenticationType": "WindowsAzureHDInsightService",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
```
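The optional properties from the table can be combined for other topologies. The following is a non-authoritative sketch, assuming a self-hosted HiveServer2 reached through ZooKeeper discovery over TLS; the host list, namespace, certificate path, and IR name are hypothetical placeholders:

```json
{
    "name": "HiveLinkedServiceZooKeeper",
    "properties": {
        "type": "Hive",
        "typeProperties": {
            "host": "<zk-node1>:2181;<zk-node2>:2181",
            "port": "10000",
            "serverType": "HiveServer2",
            "thriftTransportProtocol": "Binary",
            "authenticationType": "UsernameAndPassword",
            "serviceDiscoveryMode": true,
            "zooKeeperNameSpace": "hiveserver2",
            "enableSsl": true,
            "trustedCertPath": "<path to .pem file>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<self-hosted IR name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```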

## Dataset properties

For a full list of sections and properties available for defining datasets, see the datasets article. This section provides a list of properties supported by the Hive dataset.

To copy data from Hive, set the type property of the dataset to HiveObject. The following properties are supported:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the dataset must be set to: HiveObject | Yes |
| schema | Name of the schema. | No (if "query" in activity source is specified) |
| table | Name of the table. | No (if "query" in activity source is specified) |
| tableName | Name of the table including the schema part. This property is supported for backward compatibility. For new workloads, use schema and table. | No (if "query" in activity source is specified) |


Example:

```json
{
    "name": "HiveDataset",
    "properties": {
        "type": "HiveObject",
        "typeProperties": {},
        "schema": [],
        "linkedServiceName": {
            "referenceName": "<Hive linked service name>",
            "type": "LinkedServiceReference"
        }
    }
}
```

## Copy activity properties

For a full list of sections and properties available for defining activities, see the Pipelines article. This section provides a list of properties supported by the Hive source.

### HiveSource as source

To copy data from Hive, set the source type in the copy activity to HiveSource. The following properties are supported in the copy activity source section:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the copy activity source must be set to: HiveSource | Yes |
| query | Use the custom SQL query to read data. For example: "SELECT * FROM MyTable". | No (if "tableName" in dataset is specified) |


Example:

```json
"activities": [
    {
        "name": "CopyFromHive",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<Hive input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "HiveSource",
                "query": "SELECT * FROM MyTable"
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
```
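The "No (if … is specified)" rules above mean that a copy from Hive needs either a query in the source or a table reference in the dataset. As a minimal sketch, the mutual requirement can be expressed as a small pre-flight check; `validate_hive_copy` is a hypothetical helper, not part of any Data Factory SDK:

```python
def validate_hive_copy(dataset_props: dict, source_props: dict) -> list:
    """Check the mutual-requirement rules from the property tables:
    either the source specifies a 'query', or the dataset names a table
    via 'tableName' or 'schema' + 'table'."""
    errors = []
    if source_props.get("type") != "HiveSource":
        errors.append("source type must be 'HiveSource'")
    has_query = bool(source_props.get("query"))
    has_table = bool(dataset_props.get("tableName")) or bool(dataset_props.get("table"))
    if not (has_query or has_table):
        errors.append("specify either a query in the source or a table in the dataset")
    return errors

# A source with an explicit query needs no table in the dataset:
print(validate_hive_copy({}, {"type": "HiveSource", "query": "SELECT * FROM MyTable"}))
# prints []
```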

## Lookup activity properties

To learn details about the properties, check Lookup activity.
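A Lookup activity reuses the same HiveSource and dataset settings shown earlier. The following is a sketch only, assuming the standard Lookup activity shape; the activity name, dataset name, and query are placeholders:

```json
{
    "name": "LookupFromHive",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "HiveSource",
            "query": "SELECT MAX(load_date) AS d FROM MyTable"
        },
        "dataset": {
            "referenceName": "<Hive input dataset name>",
            "type": "DatasetReference"
        }
    }
}
```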

## Next steps

For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see supported data stores.