Stream Azure monitoring data to an event hub

Azure Monitor provides a full-stack monitoring solution for applications and services in Azure, in other clouds, and on-premises. In addition to analyzing that data in Azure Monitor and using it for different monitoring scenarios, you may need to send it to other monitoring tools in your environment. In most cases, the most effective way to stream monitoring data to external tools is with Azure Event Hubs. This article briefly describes how you can stream monitoring data from different sources to an event hub and links to detailed guidance.

Create an Event Hubs namespace

Before you configure streaming for any data source, you need to create an Event Hubs namespace and an event hub. This namespace and event hub are the destination for all of your monitoring data. An Event Hubs namespace is a logical grouping of event hubs that share the same access policy, much like a storage account contains individual blobs. Consider the following details about the Event Hubs namespace and event hubs that you use for streaming monitoring data:

  • The number of throughput units allows you to increase throughput scale for your event hubs. Only one throughput unit is typically necessary. If you need to scale up as your log usage increases, you can manually increase the number of throughput units for the namespace or enable auto-inflate.
  • The number of partitions allows you to parallelize consumption across many consumers. A single partition can support up to 20 MBps, or approximately 20,000 messages per second. Depending on the tool consuming the data, it may or may not support consuming from multiple partitions. Four partitions is a reasonable starting point if you're not sure how many to set.
  • Set message retention on your event hub to at least 7 days. If your consuming tool goes down for more than a day, this retention ensures that the tool can pick up where it left off for events up to 7 days old.
  • Use the default consumer group for your event hub. There is no need to create other consumer groups or use a separate consumer group unless you plan to have two different tools consume the same data from the same event hub.
  • For the Azure Activity log, you pick an Event Hubs namespace, and Azure Monitor creates an event hub within that namespace called insights-logs-operational-logs. For other log types, you can either choose an existing event hub or have Azure Monitor create an event hub per log category.
  • Outbound ports 5671 and 5672 must typically be opened on the computer or VNet consuming data from the event hub.
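As a rough sanity check on the sizing guidance above, you can estimate partition and throughput-unit counts from your expected event rate. A minimal sketch, assuming the per-partition limits quoted above and the standard-tier throughput-unit limits (1 MB/s or 1,000 events/s of ingress each); the workload numbers in the example are hypothetical:

```python
import math

# Per-partition ingress limits quoted above.
PARTITION_MBPS = 20.0
PARTITION_EVENTS_PER_SEC = 20_000

# One throughput unit = 1 MB/s or 1,000 events/s of ingress (standard tier).
TU_MBPS = 1.0
TU_EVENTS_PER_SEC = 1_000

def estimate_scale(events_per_sec: float, avg_event_kb: float) -> dict:
    """Return a rough partition and throughput-unit count for a workload."""
    mbps = events_per_sec * avg_event_kb / 1024
    partitions = max(
        math.ceil(mbps / PARTITION_MBPS),
        math.ceil(events_per_sec / PARTITION_EVENTS_PER_SEC),
        1,
    )
    throughput_units = max(
        math.ceil(mbps / TU_MBPS),
        math.ceil(events_per_sec / TU_EVENTS_PER_SEC),
        1,
    )
    return {"partitions": partitions, "throughput_units": throughput_units}

# Hypothetical workload: 500 log events/s averaging 1 KB each.
print(estimate_scale(500, 1.0))
```

For modest log volumes this confirms the text above: one throughput unit and a handful of partitions are enough, and auto-inflate covers occasional bursts.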

Monitoring data available

Sources of monitoring data for Azure Monitor describes the different tiers of data for Azure applications and the kinds of monitoring data available for each. The following table lists each of these tiers and describes how that data can be streamed to an event hub. Follow the links provided for further detail.

Tier | Data | Method
Azure subscription | Azure Activity Log | Create a log profile to export Activity Log events to Event Hubs. See Stream Azure platform logs to Azure Event Hubs for details.
Azure resources | Platform metrics and resource logs | Both types of data are sent to an event hub using a resource diagnostic setting. See Stream Azure resource logs to an event hub for details.
Operating system (guest) | Azure virtual machines | Install the Azure Diagnostics extension on Windows and Linux virtual machines in Azure. See Streaming Azure Diagnostics data in the hot path by using Event Hubs for details on Windows VMs and Use Linux Diagnostic Extension to monitor metrics and logs for details on Linux VMs.
Application code | Application Insights | Application Insights doesn't provide a direct method to stream data to event hubs. You can set up continuous export of the Application Insights data to a storage account and then use a Logic App to send the data to an event hub, as described in Manual streaming with Logic App.
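Whichever tier the data comes from, log events arrive on the event hub as JSON: exports produced by diagnostic settings wrap individual log entries in a `records` array. A minimal sketch of unwrapping that envelope on the consuming side; the message body below is a contrived example, and real records carry many more fields:

```python
import json

def extract_records(message_body: bytes) -> list:
    """Unwrap the {"records": [...]} envelope used by Azure Monitor exports."""
    payload = json.loads(message_body)
    return payload.get("records", [])

# Contrived example of a message body as it might arrive from the event hub.
body = json.dumps({
    "records": [
        {"time": "2024-01-01T00:00:00Z", "category": "Administrative"},
        {"time": "2024-01-01T00:00:05Z", "category": "Administrative"},
    ]
}).encode("utf-8")

for record in extract_records(body):
    print(record["time"], record["category"])
```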

Manual streaming with Logic App

For data that you can't directly stream to an event hub, you can write it to Azure Storage and then use a time-triggered Logic App that pulls the data from blob storage and pushes it to the event hub as messages.
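The core of that workflow is reading exported records from a blob and regrouping them into event hub messages that stay under the per-message size limit (about 1 MB on the standard tier). A hedged illustration of the batching step only, with blob listing and the actual send omitted; the helper name and sample records are hypothetical:

```python
import json

MAX_BATCH_BYTES = 1_000_000  # stay under the ~1 MB Event Hubs message limit

def batch_records(records, max_bytes=MAX_BATCH_BYTES):
    """Group records into JSON-encoded batches no larger than max_bytes each."""
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for record in records:
        encoded = len(json.dumps(record).encode("utf-8")) + 1  # +1 for ","
        if current and size + encoded > max_bytes:
            batches.append(json.dumps(current))
            current, size = [], 2
        current.append(record)
        size += encoded
    if current:
        batches.append(json.dumps(current))
    return batches

# Hypothetical exported records; each batch would become one event hub message.
records = [{"id": i, "msg": "x" * 300} for i in range(10)]
print(len(batch_records(records, max_bytes=700)))
```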

Partner tools with Azure Monitor integration

Routing your monitoring data to an event hub with Azure Monitor enables easy integration with external SIEM and monitoring tools. Examples of tools with Azure Monitor integration include the following:

Tool | Hosted in Azure | Description
IBM QRadar | No | The Azure DSM and Azure Event Hubs protocol are available for download from the IBM support website. You can learn more about the integration with Azure at QRadar DSM configuration.
Splunk | No | The Azure Monitor Add-On for Splunk is an open source project available in Splunkbase. The documentation is available at Azure Monitor Addon For Splunk. If you can't install an add-on in your Splunk instance (for example, if you're using a proxy or running on Splunk Cloud), you can forward these events to the Splunk HTTP Event Collector by using Azure Function For Splunk, which is triggered by new messages in the event hub.
SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at Collect Logs for the Azure Audit App from Event Hub.
ArcSight | No | The ArcSight Azure Event Hubs smart connector is available as part of the ArcSight smart connector collection.
Syslog server | No | If you want to stream Azure Monitor data directly to a syslog server, you can use a solution based on an Azure function.
LogRhythm | No | Instructions to set up LogRhythm to collect logs from an event hub are available here.
Logz.io | Yes | For more information, see Getting started with monitoring and logging using Logz.io for Java apps running on Azure.
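Several of these integrations follow the same pattern: a function triggered by new event hub messages reshapes each record and forwards it over HTTP. As a concrete illustration, the sketch below converts Azure Monitor records into the JSON body that Splunk's HTTP Event Collector expects (an `event` field plus optional `time` and `sourcetype`). This is a simplified assumption of the transform such a function performs; the actual HTTP send, authentication, and the `azure:monitor` sourcetype name are illustrative only:

```python
import json
from datetime import datetime

def to_hec_body(records, sourcetype="azure:monitor"):
    """Build a Splunk HEC request body from Azure Monitor records."""
    payloads = []
    for record in records:
        payload = {"event": record, "sourcetype": sourcetype}
        if "time" in record:
            # Convert the ISO 8601 timestamp to the epoch seconds HEC expects.
            ts = datetime.fromisoformat(record["time"].replace("Z", "+00:00"))
            payload["time"] = ts.timestamp()
        payloads.append(payload)
    # HEC accepts multiple events as concatenated JSON objects in one body.
    return "".join(json.dumps(p) for p in payloads)

records = [{"time": "2024-01-01T00:00:00Z", "category": "Administrative"}]
print(to_hec_body(records))
```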

Next steps