Copy and transform data from Hive using Azure Data Factory

APPLIES TO: Azure Data Factory, Azure Synapse Analytics

This article outlines how to use the Copy activity in Azure Data Factory to copy data from Hive. It builds on the copy activity overview article, which presents a general overview of the Copy activity.

Supported capabilities

This Hive connector is supported for the following activities:

  • Copy activity with supported source/sink matrix
  • Mapping data flow
  • Lookup activity

You can copy data from Hive to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the Supported data stores table.

Azure Data Factory provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.

Prerequisites

If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.

Alternatively, if your data store is a managed cloud data service, you can use the Azure integration runtime. If access is restricted to IPs that are approved in the firewall rules, you can add Azure Integration Runtime IPs to the allow list.

For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.

Getting started

To perform the Copy activity with a pipeline, you can use one of the following tools or SDKs:

  • The Copy Data tool
  • The Azure portal
  • The .NET SDK
  • The Python SDK
  • Azure PowerShell
  • The REST API
  • The Azure Resource Manager template

The following sections provide details about properties that are used to define Data Factory entities specific to the Hive connector.

Linked service properties

The following properties are supported for the Hive linked service:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property must be set to: Hive | Yes |
| host | IP address or host name of the Hive server, separated by ';' for multiple hosts (only when serviceDiscoveryMode is enabled). | Yes |
| port | The TCP port that the Hive server uses to listen for client connections. If you connect to Azure HDInsight, specify port 443. | Yes |
| serverType | The type of Hive server. Allowed values are: HiveServer1, HiveServer2, HiveThriftServer | No |
| thriftTransportProtocol | The transport protocol to use in the Thrift layer. Allowed values are: Binary, SASL, HTTP | No |
| authenticationType | The authentication method used to access the Hive server. Allowed values are: Anonymous, Username, UsernameAndPassword, WindowsAzureHDInsightService. Kerberos authentication is currently not supported. | Yes |
| serviceDiscoveryMode | true to indicate using the ZooKeeper service, false if not. | No |
| zooKeeperNameSpace | The namespace on ZooKeeper under which Hive Server 2 nodes are added. | No |
| useNativeQuery | Specifies whether the driver uses native HiveQL queries or converts them into an equivalent form in HiveQL. | No |
| username | The user name that you use to access Hive Server. | No |
| password | The password corresponding to the user. Mark this field as a SecureString to store it securely in Data Factory, or reference a secret stored in Azure Key Vault. | No |
| httpPath | The partial URL corresponding to the Hive server. | No |
| enableSsl | Specifies whether connections to the server are encrypted using TLS. The default value is false. | No |
| trustedCertPath | The full path of the .pem file containing trusted CA certificates used to verify the server when connecting over TLS. This property can only be set when using TLS on a self-hosted IR. The default value is the cacerts.pem file installed with the IR. | No |
| useSystemTrustStore | Specifies whether to use a CA certificate from the system trust store or from a specified PEM file. The default value is false. | No |
| allowHostNameCNMismatch | Specifies whether to require a CA-issued TLS/SSL certificate name to match the host name of the server when connecting over TLS. The default value is false. | No |
| allowSelfSignedServerCert | Specifies whether to allow self-signed certificates from the server. The default value is false. | No |
| connectVia | The Integration Runtime to be used to connect to the data store. Learn more from the Prerequisites section. If not specified, it uses the default Azure Integration Runtime. | No |
| storageReference | A reference to the linked service of the storage account used for staging data in mapping data flow. This is required only when using the Hive linked service in mapping data flow. | No |

Example:

{
    "name": "HiveLinkedService",
    "properties": {
        "type": "Hive",
        "typeProperties": {
            "host" : "<cluster>.azurehdinsight.cn",
            "port" : "<port>",
            "authenticationType" : "WindowsAzureHDInsightService",
            "username" : "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}
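
If you prefer not to store the password inline, the password property (see the table above) can instead reference a secret stored in Azure Key Vault. A minimal sketch of that variant, assuming you have already defined an Azure Key Vault linked service and created the secret:

{
    "name": "HiveLinkedService",
    "properties": {
        "type": "Hive",
        "typeProperties": {
            "host" : "<cluster>.azurehdinsight.cn",
            "port" : "443",
            "authenticationType" : "WindowsAzureHDInsightService",
            "username" : "<username>",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secret name>"
            }
        }
    }
}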

Dataset properties

For a full list of sections and properties available for defining datasets, see the datasets article. This section provides a list of properties supported by the Hive dataset.

To copy data from Hive, set the type property of the dataset to HiveObject. The following properties are supported:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property of the dataset must be set to: HiveObject | Yes |
| schema | Name of the schema. | No (if "query" in activity source is specified) |
| table | Name of the table. | No (if "query" in activity source is specified) |
| tableName | Name of the table including the schema part. This property is supported for backward compatibility. For new workloads, use schema and table. | No (if "query" in activity source is specified) |

Example

{
    "name": "HiveDataset",
    "properties": {
        "type": "HiveObject",
        "typeProperties": {},
        "schema": [],
        "linkedServiceName": {
            "referenceName": "<Hive linked service name>",
            "type": "LinkedServiceReference"
        }
    }
}
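
The typeProperties object above is empty, so the dataset relies on a query defined in the activity source. A sketch of the variant that instead names a source table directly, using the schema and table properties from the table above:

{
    "name": "HiveDataset",
    "properties": {
        "type": "HiveObject",
        "typeProperties": {
            "schema": "<schema name>",
            "table": "<table name>"
        },
        "schema": [],
        "linkedServiceName": {
            "referenceName": "<Hive linked service name>",
            "type": "LinkedServiceReference"
        }
    }
}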

Copy activity properties

For a full list of sections and properties available for defining activities, see the Pipelines article. This section provides a list of properties supported by the Hive source.

HiveSource as source

To copy data from Hive, set the source type in the copy activity to HiveSource. The following properties are supported in the copy activity source section:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property of the copy activity source must be set to: HiveSource | Yes |
| query | Use the custom SQL query to read data. For example: "SELECT * FROM MyTable". | No (if "tableName" in dataset is specified) |

Example:

"activities":[
    {
        "name": "CopyFromHive",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<Hive input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "HiveSource",
                "query": "SELECT * FROM MyTable"
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
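
Because query is optional when the dataset identifies the table (see the tables above), you can also read an entire table without a query. A sketch of the typeProperties for that case, assuming the dataset sets schema and table (or the legacy tableName):

"typeProperties": {
    "source": {
        "type": "HiveSource"
    },
    "sink": {
        "type": "<sink type>"
    }
}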

Mapping data flow properties

The Hive connector is supported as an inline dataset source in mapping data flows. Read using a query or directly from a Hive table in HDInsight. Hive data gets staged in a storage account as parquet files before being transformed as part of a data flow.

Source properties

The table below lists the properties supported by a Hive source. You can edit these properties in the Source options tab.

| Name | Description | Required | Allowed values | Data flow script property |
| --- | --- | --- | --- | --- |
| Store | Store must be hive | yes | hive | store |
| Format | Whether you are reading from a table or a query | yes | table or query | format |
| Schema name | If reading from a table, the schema of the source table | yes, if format is table | String | schemaName |
| Table name | If reading from a table, the table name | yes, if format is table | String | tableName |
| Query | If format is query, the source query on the Hive linked service | yes, if format is query | String | query |
| Staged | The Hive table will always be staged. | yes | true | staged |
| Storage container | Storage container used to stage data before reading from Hive or writing to Hive. The Hive cluster must have access to this container. | yes | String | storageContainer |
| Staging database | The schema/database to which the user account specified in the linked service has access. It is used to create external tables during staging; they are dropped afterwards. | no | String | stagingDatabaseName |
| Pre SQL scripts | SQL code to run on the Hive table before reading the data | no | String | preSQLs |

Source example

Below is an example of a Hive source configuration:

[Screenshot: Hive source example]

These settings translate into the following data flow script:

source(
    allowSchemaDrift: true,
    validateSchema: false,
    ignoreNoFilesFound: false,
    format: 'table',
    store: 'hive',
    schemaName: 'default',
    tableName: 'hivesampletable',
    staged: true,
    storageContainer: 'khive',
    storageFolderPath: '',
    stagingDatabaseName: 'default') ~> hivesource
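
If you set Format to query instead, the schemaName and tableName properties are replaced by a query, per the source properties table above. A sketch of that variant, assuming the same staging container (the query text is illustrative):

source(
    allowSchemaDrift: true,
    validateSchema: false,
    ignoreNoFilesFound: false,
    format: 'query',
    store: 'hive',
    query: 'SELECT * FROM hivesampletable',
    staged: true,
    storageContainer: 'khive',
    storageFolderPath: '',
    stagingDatabaseName: 'default') ~> hivesource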

Known limitations

  • Complex types such as arrays, maps, structs, and unions are not supported for read.
  • The Hive connector only supports Hive tables in Azure HDInsight version 4.0 or greater (Apache Hive 3.1.0).

Lookup activity properties

To learn details about the properties, check Lookup activity.
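
The Lookup activity reuses the same HiveSource described above. A minimal sketch, assuming the Hive dataset defined earlier (the activity name and query are illustrative):

"activities":[
    {
        "name": "LookupFromHive",
        "type": "Lookup",
        "typeProperties": {
            "source": {
                "type": "HiveSource",
                "query": "SELECT * FROM MyTable"
            },
            "dataset": {
                "referenceName": "<Hive input dataset name>",
                "type": "DatasetReference"
            },
            "firstRowOnly": true
        }
    }
]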

Next steps

For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see supported data stores.