Data collection in Azure Monitor

Azure Monitor has a common data platform that consolidates data from a variety of sources. Currently, different sources of data for Azure Monitor use different methods to deliver their data, and each typically requires a different type of configuration. For a description of the most common data sources, see Sources of monitoring data for Azure Monitor.

Azure Monitor is implementing a new ETL-like data collection pipeline that improves on legacy data collection methods. This process uses a common data ingestion pipeline for all data sources and provides a standard method of configuration that's more manageable and scalable than current methods. Specific advantages of the new data collection include the following:

  • Common set of destinations for different data sources.
  • Ability to apply a transformation to filter or modify incoming data before it's stored.
  • Consistent method for configuration of different data sources.
  • Scalable configuration options supporting infrastructure as code and DevOps processes.

When implementation is complete, all data collected by Azure Monitor will use the new data collection process and be managed by data collection rules. Currently, only certain data collection methods support the ingestion pipeline, and they may have limited configuration options. There's no difference between data collected with the new ingestion pipeline and data collected using other methods. The data is all stored together as Logs and Metrics, supporting Azure Monitor features such as log queries, alerts, and workbooks. The only difference is in the method of collection.

Data collection rules

Azure Monitor data collection is configured using a data collection rule (DCR). A DCR defines the details of a particular data collection scenario including what data should be collected, how to potentially transform that data, and where to send that data. A single DCR can be used with multiple monitored resources, giving you a consistent method to configure a variety of monitoring scenarios. In some cases, Azure Monitor will create and configure a DCR for you using options in the Azure portal. You may also directly edit DCRs to configure particular scenarios.

See Data collection rules in Azure Monitor for details on data collection rules including how to view and create them.
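To make the shape of a DCR concrete, the following is a simplified, illustrative fragment of the JSON behind a DCR that collects a performance counter and sends it to a workspace. All names, the counter specifier, and the workspace resource ID are placeholders, and real DCRs contain additional properties:

```json
{
  "location": "eastus",
  "properties": {
    "dataSources": {
      "performanceCounters": [
        {
          "name": "perfCounters",
          "streams": [ "Microsoft-Perf" ],
          "samplingFrequencyInSeconds": 60,
          "counterSpecifiers": [ "\\Processor(_Total)\\% Processor Time" ]
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "name": "centralWorkspace",
          "workspaceResourceId": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Microsoft-Perf" ],
        "destinations": [ "centralWorkspace" ],
        "transformKql": "source | where CounterValue > 0"
      }
    ]
  }
}
```

The three parts mirror the description above: dataSources defines what to collect, the optional transformKql in a data flow defines how to transform it, and destinations defines where to send it.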

Transformations

One of the most valuable features of the new data collection process is data transformations, which allow you to apply a KQL query to incoming data to modify it before sending it to its destination. You might filter out unwanted data or modify existing data to improve your query or reporting capabilities.

See Data collection transformations in Azure Monitor for complete details on transformations, including how to write transformation queries.
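As an illustration, a transformation is a KQL statement that runs against a virtual table named source, which represents the incoming data. A hypothetical transformation that drops verbose records and trims a column before storage might look like the following (the SeverityLevel and Message columns are assumed for illustration):

```kusto
source
| where SeverityLevel != "Verbose"             // filter out unwanted records
| extend Message = substring(Message, 0, 256)  // trim a column before it's stored
```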

Data collection scenarios

The following sections describe the data collection scenarios that currently support DCRs and the new data ingestion pipeline.

Azure Monitor agent

Important

The Log Analytics agent is on a deprecation path and won't be supported after August 31, 2024. Any new data centers brought online after January 1, 2024, won't support the Log Analytics agent. If you use the Log Analytics agent to ingest data to Azure Monitor, migrate to the new Azure Monitor agent before that date.

The diagram below shows data collection for the Azure Monitor agent running on a virtual machine. In this scenario, the DCR specifies events and performance data to collect from the agent machine, a transformation to filter and modify the data after it's collected, and a Log Analytics workspace to send the transformed data to. To implement this scenario, you create an association between the DCR and the agent. One agent can be associated with multiple DCRs, and one DCR can be associated with multiple agents.

Diagram showing data collection for Azure Monitor agent.

See Collect data from virtual machines with the Azure Monitor agent for details on creating a DCR for the Azure Monitor agent.
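The association itself is a separate Azure resource. A hedged sketch of an ARM template resource that associates a DCR with a virtual machine might look like the following; the names, the API version, and the resource IDs are placeholders for illustration:

```json
{
  "type": "Microsoft.Insights/dataCollectionRuleAssociations",
  "apiVersion": "2022-06-01",
  "name": "my-vm-dcr-association",
  "scope": "[resourceId('Microsoft.Compute/virtualMachines', 'my-vm')]",
  "properties": {
    "dataCollectionRuleId": "[resourceId('Microsoft.Insights/dataCollectionRules', 'my-dcr')]"
  }
}
```

Because the association is its own resource, adding a second VM to an existing DCR is just a matter of creating another association, which is what makes one DCR reusable across many agents.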

Logs ingestion API

The diagram below shows data collection for the Logs ingestion API, which allows you to send data to a Log Analytics workspace from any REST client. In this scenario, the API call connects to a data collection endpoint (DCE) and specifies a DCR to accept its incoming data. The DCR understands the structure of the incoming data, includes a transformation that ensures the data is in the format of the target table, and specifies a workspace and table to send the transformed data to.

Diagram showing data collection for custom application using logs ingestion API.

See Logs ingestion API in Azure Monitor (Preview) for details on the Logs ingestion API.
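As a sketch of what a client does, the API call is an HTTP POST to the DCE that names the DCR (by its immutable ID) and a stream, with the records as a JSON array in the body. The helper below only builds the URL and body; the endpoint, DCR ID, and custom table name are hypothetical, and the URL pattern and authentication details should be checked against the Logs ingestion API documentation:

```python
import json

def build_ingestion_request(dce_endpoint, dcr_immutable_id, stream_name, records,
                            api_version="2023-01-01"):
    """Build the URL and JSON body for a Logs ingestion API call (illustrative)."""
    url = (f"{dce_endpoint}/dataCollectionRules/{dcr_immutable_id}"
           f"/streams/{stream_name}?api-version={api_version}")
    return url, json.dumps(records)

# Hypothetical DCE, DCR immutable ID, and custom table, for illustration only.
url, body = build_ingestion_request(
    "https://my-dce.eastus-1.ingest.monitor.azure.com",
    "dcr-00000000000000000000000000000000",
    "Custom-MyTable_CL",
    [{"TimeGenerated": "2024-01-01T00:00:00Z", "RawData": "test event"}],
)
# The request would then be POSTed with an Azure AD bearer token and a
# Content-Type of application/json, e.g. via requests.post(url, data=body, headers=...).
```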

Workspace transformation DCR

The diagram below shows data collection for resource logs using a workspace transformation DCR. This is a special DCR that's associated with a workspace and provides a default transformation for supported tables. This transformation is applied to any data sent to the table that doesn't use another DCR. The example here shows resource logs using a diagnostic setting, but this same transformation could be applied to other data collection methods such as Log Analytics agent or Container insights.

Diagram showing data collection for resource logs using a transformation in the workspace transformation DCR.

See Workspace transformation DCR for details about workspace transformation DCRs and links to walkthroughs for creating them.

Frequently asked questions

This section provides answers to common questions.

Is there a maximum amount of data that I can collect in Azure Monitor?

There's no limit to the amount of metric data you can collect, but this data is stored for a maximum of 93 days. See Retention of metrics. There's no inherent limit on the amount of log data you can collect, but the pricing tier you choose for the Log Analytics workspace might affect how much you can ingest. See Pricing details.

How do I access data collected by Azure Monitor?

Insights and solutions provide a custom experience for working with data stored in Azure Monitor. You can work directly with log data by using a log query written in Kusto Query Language (KQL). In the Azure portal, you can write and run queries and interactively analyze data by using Log Analytics. Analyze metrics in the Azure portal with the metrics explorer. See Analyze log data in Azure Monitor.
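For example, a simple KQL log query that counts heartbeat records per computer over the last hour (assuming the standard Heartbeat table is populated in your workspace) looks like this:

```kusto
Heartbeat
| where TimeGenerated > ago(1h)
| summarize HeartbeatCount = count() by Computer
```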

Next steps