Azure Stream Analytics on IoT Edge

Azure Stream Analytics (ASA) on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data. Azure Stream Analytics is designed for low latency, resiliency, efficient use of bandwidth, and compliance. Enterprises can now deploy control logic close to the industrial operations and complement Big Data analytics done in the cloud.

Azure Stream Analytics on IoT Edge runs within the Azure IoT Edge framework. Once the job is created in ASA, you can deploy and manage it using IoT Hub.


High-level diagram of IoT Edge

  • Low-latency command and control: For example, manufacturing safety systems must respond to operational data with ultra-low latency. With ASA on IoT Edge, you can analyze sensor data in near real time, and issue commands to stop a machine or trigger alerts when you detect anomalies.
  • Limited connectivity to the cloud: Mission-critical systems, such as remote mining equipment, connected vessels, or offshore drilling, need to analyze and react to data even when cloud connectivity is intermittent. With ASA, your streaming logic runs independently of the network connectivity and you can choose what you send to the cloud for further processing or storage.
  • Limited bandwidth: The volume of data produced by jet engines or connected cars can be so large that data must be filtered or pre-processed before sending it to the cloud. Using ASA, you can filter or aggregate the data that needs to be sent to the cloud.
  • Compliance: Regulatory compliance may require some data to be locally anonymized or aggregated before being sent to the cloud.
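As a sketch of the limited-bandwidth scenario above, a query like the following sends one aggregate per device per minute upstream instead of every raw reading. The input name telemetry, the output name cloudOutput, and the eventTime field are assumptions for illustration, not names from this article:

```sql
-- Hypothetical aggregation query that reduces the volume of data
-- sent to the cloud. "telemetry", "cloudOutput", and "eventTime"
-- are illustrative names, not part of any real job definition.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    COUNT(*) AS readingCount
INTO
    cloudOutput
FROM
    telemetry TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(minute, 1)
```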

Edge jobs in Azure Stream Analytics

What is an "edge" job?

ASA Edge jobs run in containers deployed to Azure IoT Edge devices. They are composed of two parts:

  1. A cloud part that is responsible for the job definition: users define the inputs, outputs, query, and other settings (such as out-of-order events) in the cloud.
  2. A module running on your IoT devices. It contains the ASA engine and receives the job definition from the cloud.

ASA uses IoT Hub to deploy edge jobs to devices. More information about IoT Edge deployment can be found here.

Azure Stream Analytics Edge job

Installation instructions

The high-level steps are described below. More details are given in the following sections.

  1. Create a storage container: Storage containers are used to save your job definition where it can be accessed by your IoT devices. You can reuse any existing storage container.
  2. Create an ASA edge job: Create a new job and select Edge as the hosting environment. These jobs are created and managed from the cloud, and run on your own IoT Edge devices.
  3. Set up your IoT Edge environment on your device(s): Instructions are available for Windows or Linux.
  4. Deploy ASA on your IoT Edge device(s): The ASA job definition is exported to the storage container created earlier.

You can follow this step-by-step tutorial to deploy your first ASA job on IoT Edge.

Create a storage container

A storage container is required in order to export the ASA compiled query and the job configuration. It is used to configure the ASA Docker image with your specific query.

  1. Follow these instructions to create a storage account from the Azure portal. You can keep all default options to use this account with ASA.
  2. In the newly created storage account, create a blob storage container:
    1. Click Blobs, then + Container.
    2. Enter a name and keep the container set to Private.

Create an ASA Edge job

  1. From the Azure portal, create a new Stream Analytics job. A direct link to create a new ASA job is here.

  2. In the creation screen, select Edge as the hosting environment (see the following picture).

    Create a Stream Analytics job on Edge

  3. Job definition

    1. Define input stream(s). Define one or several input streams for your job.
    2. Define reference data (optional).
    3. Define output stream(s). Define one or several output streams for your job.
    4. Define the query. Define the ASA query in the cloud using the inline editor. The compiler automatically checks the syntax enabled for ASA edge. You can also test your query by uploading sample data.
  4. Set the storage container information in the IoT Edge settings menu.

  5. Set optional settings

    1. Event ordering. You can configure the out-of-order policy in the portal. Documentation is available here.
    2. Locale. Set the internationalization format.
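As a sketch of the query step above, a minimal edge query might look like the following. The input name temperature, the output name alert, and the machine.temperature field are illustrative assumptions (they echo the names used in the routing example later in this article):

```sql
-- Hypothetical anomaly-detection query for an ASA Edge job.
-- "temperature" (input), "alert" (output), and the nested field
-- machine.temperature are assumptions for illustration.
SELECT
    machine.temperature AS temperature,
    'TemperatureTooHigh' AS alertType
INTO
    alert
FROM
    temperature
WHERE
    machine.temperature > 100
```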


When a deployment is created, ASA exports the job definition to a storage container. This job definition remains the same for the duration of a deployment. As a consequence, if you want to update a job running on the edge, you need to edit the job in ASA, and then create a new deployment in IoT Hub.

Set up your IoT Edge environment on your device(s)

Edge jobs can be deployed on devices running Azure IoT Edge. To do so, follow these steps:

  • Create an IoT Hub.
  • Install Docker and the IoT Edge runtime on your edge devices.
  • Set your devices as IoT Edge devices in IoT Hub.

These steps are described in the IoT Edge documentation for Windows or Linux.

Deploy ASA on your IoT Edge device(s)

Add ASA to your deployment
  • In the Azure portal, open IoT Hub, navigate to IoT Edge, and click the device you want to target for this deployment.
  • Select Set modules, then select + Add and choose Azure Stream Analytics Module.
  • Select the subscription and the ASA Edge job that you created. Click Save.

Add ASA module in your deployment


During this step, ASA creates a folder named "EdgeJobs" in the storage container (if it does not already exist). For each deployment, a new subfolder is created in the "EdgeJobs" folder. When you deploy your job to IoT Edge devices, ASA creates a shared access signature (SAS) for the job definition file. The SAS key is securely transmitted to the IoT Edge devices using the device twin. The key expires three years from the day of its creation. When you update an IoT Edge job, the SAS changes, but the image version does not. Once you update, follow the deployment workflow, and an update notification is logged on the device.

For more information about IoT Edge deployments, see this page.

Configure routes

IoT Edge provides a way to declaratively route messages between modules, and between modules and IoT Hub. The full syntax is described here. The names of the inputs and outputs created in the ASA job can be used as endpoints for routing.

    "routes": {
        "sensorToAsa":   "FROM /messages/modules/tempSensor/* INTO BrokeredEndpoint(\"/modules/ASA/inputs/temperature\")",
        "alertsToCloud": "FROM /messages/modules/ASA/* INTO $upstream",
        "alertsToReset": "FROM /messages/modules/ASA/* INTO BrokeredEndpoint(\"/modules/tempSensor/inputs/control\")"
    }

This example shows the routes for the scenario described in the following picture. It contains an edge job called "ASA", with an input named "temperature" and an output named "alert".

Diagram example of message routing

This example defines the following routes:

  • Every message from tempSensor is sent to the input named temperature of the module named ASA,
  • All outputs of the ASA module are sent to the IoT Hub linked to this device ($upstream),
  • All outputs of the ASA module are sent to the control endpoint of tempSensor.
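These routes are declared in the desired properties of the $edgeHub module twin in the IoT Edge deployment manifest. As a minimal sketch (surrounding manifest sections such as $edgeAgent, and the exact schema version for your runtime, are omitted or assumed here):

```json
{
  "$edgeHub": {
    "properties.desired": {
      "schemaVersion": "1.0",
      "routes": {
        "sensorToAsa":   "FROM /messages/modules/tempSensor/* INTO BrokeredEndpoint(\"/modules/ASA/inputs/temperature\")",
        "alertsToCloud": "FROM /messages/modules/ASA/* INTO $upstream",
        "alertsToReset": "FROM /messages/modules/ASA/* INTO BrokeredEndpoint(\"/modules/tempSensor/inputs/control\")"
      },
      "storeAndForwardConfiguration": {
        "timeToLiveSecs": 7200
      }
    }
  }
}
```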

Technical information

Current limitations for IoT Edge jobs compared to cloud jobs

The goal is to have parity between IoT Edge jobs and cloud jobs. Most SQL query language features are supported, enabling you to run the same logic on both cloud and IoT Edge. However, the following features are not yet supported for edge jobs:

  • User-defined functions (UDF) in JavaScript. UDFs are available in C# for IoT Edge jobs (preview).
  • User-defined aggregates (UDA).
  • Azure ML functions.
  • Using more than 14 aggregates in a single step.
  • AVRO format for input/output. At this time, only CSV and JSON are supported.
  • The following SQL operators:
    • GetMetadataPropertyValue
  • Late arrival policy

Runtime and hardware requirements

ASA and Azure IoT Edge use Docker containers to provide a portable solution that runs on multiple host operating systems (Windows, Linux).

ASA on IoT Edge is made available as Windows and Linux images, running on both x86-64 and ARM (Advanced RISC Machines) architectures.

Input and output

Input and output streams

ASA Edge jobs can get inputs and outputs from other modules running on IoT Edge devices. To connect from and to specific modules, you can set the routing configuration at deployment time. More information is available in the IoT Edge module composition documentation.

CSV and JSON formats are supported for both inputs and outputs.

For each input and output stream you create in your ASA job, a corresponding endpoint is created on your deployed module. These endpoints can be used in the routes of your deployment.

At present, the only supported stream input and stream output types are Edge Hub. Reference input supports the reference file type. Other outputs can be reached using a cloud job downstream. For example, a Stream Analytics job hosted in Edge sends output to Edge Hub, which can then send output to IoT Hub. You can then use a second, cloud-hosted Azure Stream Analytics job with input from IoT Hub and output to Power BI or another output type.

Reference data

Reference data (also known as a lookup table) is a finite data set that is static or slow-changing in nature. It is used to perform a lookup or to correlate with your data stream. To make use of reference data in your Azure Stream Analytics job, you will generally use a reference data JOIN in your query. For more information, see Using reference data for lookups in Stream Analytics.
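As a sketch, a reference data JOIN pairs each streaming event with a matching row from the lookup table. The names sensorInput, deviceReference, alert, and the threshold column are assumptions for illustration:

```sql
-- Hypothetical reference data JOIN. "sensorInput" is a stream input,
-- "deviceReference" is a reference data input, "alert" is an output;
-- all names and columns are illustrative.
SELECT
    s.deviceId,
    s.temperature,
    r.threshold
INTO
    alert
FROM
    sensorInput s
JOIN
    deviceReference r
ON
    s.deviceId = r.deviceId
WHERE
    s.temperature > r.threshold
```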

Only local reference data is supported. When a job is deployed to an IoT Edge device, it loads reference data from the user-defined file path.

To create a job with reference data on Edge:

  1. Create a new input for your job.

  2. Choose Reference data as the source type.

  3. Have a reference data file ready on the device. For a Windows container, put the reference data file on the local drive and share the local drive with the Docker container. For a Linux container, create a Docker volume and populate the data file to the volume.

  4. Set the file path. For a Windows host OS and Windows container, use the absolute path: E:\<PathToFile>\v1.csv. For a Windows host OS and Linux container, or a Linux OS and Linux container, use the path in the volume: <VolumeName>/file1.txt.

New reference data input for an Azure Stream Analytics job on IoT Edge

An update of the reference data on IoT Edge is triggered by a deployment. Once triggered, the ASA module picks up the updated data without stopping the running job.

There are two ways to update the reference data:

  • Update the reference data path in your ASA job from the Azure portal.
  • Update the IoT Edge deployment.

License and third-party notices

Azure Stream Analytics module image information

This version information was last updated on 2019-06-27:

  • Image

    • base image: microsoft/dotnet:2.1.6-runtime-alpine3.7
    • platform:
      • architecture: amd64
      • os: linux
  • Image

    • base image: microsoft/dotnet:2.1.6-runtime-bionic-arm32v7
    • platform:
      • architecture: arm
      • os: linux
  • Image

    • base image: microsoft/dotnet:2.1.6-runtime-nanoserver-1809
    • platform:
      • architecture: amd64
      • os: windows

Get help

For further assistance, try the Microsoft Q&A question page for Azure Stream Analytics.

Next steps