Install the Azure Monitor Ingestion client library and the Azure Identity library. The Azure Identity library is required for the authentication used in this sample.
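For example, from your project directory, you can add both packages with the .NET CLI:

dotnet add package Azure.Monitor.Ingestion
dotnet add package Azure.Identity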
Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity library.
AZURE_TENANT_ID
AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
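For example, in bash (the variable names are the same on every platform; only the shell syntax differs, and the values shown here are placeholders):

export AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_SECRET="your-client-secret"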
Replace the variables in the following sample code with values from your DCR. You might also want to replace the sample data with your own.
using Azure;
using Azure.Core;
using Azure.Identity;
using Azure.Monitor.Ingestion;

// Initialize variables
var endpoint = new Uri("https://my-url.monitor.azure.cn");
var ruleId = "dcr-00000000000000000000000000000000";
var streamName = "Custom-MyTableRawData";

// Create credential and client
var credential = new DefaultAzureCredential();
LogsIngestionClient client = new(endpoint, credential);

DateTimeOffset currentTime = DateTimeOffset.UtcNow;

// Use BinaryData to serialize instances of an anonymous type into JSON
BinaryData data = BinaryData.FromObjectAsJson(
    new[] {
        new
        {
            Time = currentTime,
            Computer = "Computer1",
            AdditionalContext = new
            {
                InstanceName = "user1",
                TimeZone = "Pacific Time",
                Level = 4,
                CounterName = "AppMetric1",
                CounterValue = 15.3
            }
        },
        new
        {
            Time = currentTime,
            Computer = "Computer2",
            AdditionalContext = new
            {
                InstanceName = "user2",
                TimeZone = "Central Time",
                Level = 3,
                CounterName = "AppMetric1",
                CounterValue = 23.5
            }
        },
    });

// Upload logs
try
{
    var response = await client.UploadAsync(ruleId, streamName, RequestContent.Create(data)).ConfigureAwait(false);
    if (response.IsError)
    {
        throw new Exception(response.ToString());
    }
    Console.WriteLine("Log upload completed using content upload");
}
catch (Exception ex)
{
    Console.WriteLine("Upload failed with Exception: " + ex.Message);
}

// Logs can also be uploaded in a List
var entries = new List<object>();
for (int i = 0; i < 10; i++)
{
    entries.Add(
        new
        {
            Time = currentTime,
            Computer = "Computer" + i.ToString(),
            AdditionalContext = new
            {
                InstanceName = "user" + i.ToString(),
                TimeZone = "Central Time",
                Level = 3,
                CounterName = "AppMetric1" + i.ToString(),
                CounterValue = i
            }
        }
    );
}

// Make the request
try
{
    var response = await client.UploadAsync(ruleId, streamName, entries).ConfigureAwait(false);
    if (response.IsError)
    {
        throw new Exception(response.ToString());
    }
    Console.WriteLine("Log upload completed using list of entries");
}
catch (Exception ex)
{
    Console.WriteLine("Upload failed with Exception: " + ex.Message);
}
Execute the code, and the data should arrive in your Log Analytics workspace within a few minutes.
Use go get to install the Azure Monitor Ingestion Logs and Azure Identity client modules for Go. The Azure Identity module is required for the authentication used in this sample.
go get github.com/Azure/azure-sdk-for-go/sdk/monitor/ingestion/azlogs
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity
Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity module.
AZURE_TENANT_ID
AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
Replace the variables in the following sample code with values from your DCR. You might also want to replace the sample data with your own.
package main

import (
	"context"
	"encoding/json"
	"strconv"
	"time"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	"github.com/Azure/azure-sdk-for-go/sdk/monitor/ingestion/azlogs"
)

// logs ingestion URI
const endpoint = "https://my-url.monitor.azure.cn"

// data collection rule (DCR) immutable ID
const ruleID = "dcr-00000000000000000000000000000000"

// stream name in the DCR that represents the destination table
const streamName = "Custom-MyTableRawData"

type Computer struct {
	Time              time.Time
	Computer          string
	AdditionalContext string
}

func main() {
	// creating the client using DefaultAzureCredential
	cred, err := azidentity.NewDefaultAzureCredential(nil)
	if err != nil {
		//TODO: handle error
	}
	client, err := azlogs.NewClient(endpoint, cred, nil)
	if err != nil {
		//TODO: handle error
	}

	// generating logs
	// logs should match the schema defined by the provided stream
	var data []Computer
	for i := 0; i < 10; i++ {
		data = append(data, Computer{
			Time:              time.Now().UTC(),
			Computer:          "Computer" + strconv.Itoa(i),
			AdditionalContext: "context",
		})
	}

	// marshal data into []byte
	logs, err := json.Marshal(data)
	if err != nil {
		panic(err)
	}

	// upload logs
	_, err = client.Upload(context.TODO(), ruleID, streamName, logs, nil)
	if err != nil {
		//TODO: handle error
	}
}
Execute the code, and the data should arrive in your Log Analytics workspace within a few minutes.
Include the Logs ingestion package and the azure-identity package from the Azure Identity library. The Azure Identity library is required for the authentication used in this sample.
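If you use Maven, the dependencies typically look like the following; check the package pages for the current version numbers, which are placeholders here:

<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-monitor-ingestion</artifactId>
    <version>1.2.0</version>
</dependency>
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.11.0</version>
</dependency>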
Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity library.
AZURE_TENANT_ID
AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
Replace the variables in the following sample code with values from your DCR. You might also want to replace the sample data with your own.
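A minimal Java sketch, mirroring the other language samples; it assumes the LogsIngestionClientBuilder and upload APIs from the azure-monitor-ingestion package, and the class name UploadLogs is illustrative:

import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.ingestion.LogsIngestionClient;
import com.azure.monitor.ingestion.LogsIngestionClientBuilder;

import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class UploadLogs {
    public static void main(String[] args) {
        // Replace with the logs ingestion URI, immutable ID, and stream name from your DCR
        String endpoint = "https://my-url.monitor.azure.cn";
        String ruleId = "dcr-00000000000000000000000000000000";
        String streamName = "Custom-MyTableRawData";

        // DefaultAzureCredential reads AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET
        LogsIngestionClient client = new LogsIngestionClientBuilder()
                .endpoint(endpoint)
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // Build sample entries matching the stream's schema; maps serialize to JSON objects
        List<Object> logs = new ArrayList<>();
        for (int i = 0; i < 2; i++) {
            Map<String, Object> context = new HashMap<>();
            context.put("InstanceName", "user" + i);
            context.put("TimeZone", "Pacific Time");
            context.put("Level", 4);
            context.put("CounterName", "AppMetric1");
            context.put("CounterValue", 15.3);

            Map<String, Object> entry = new HashMap<>();
            entry.put("Time", OffsetDateTime.now(ZoneOffset.UTC).toString());
            entry.put("Computer", "Computer" + i);
            entry.put("AdditionalContext", context);
            logs.add(entry);
        }

        client.upload(ruleId, streamName, logs);
        System.out.println("Log upload completed");
    }
}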
Use npm to install the Azure Monitor Ingestion and Azure Identity client libraries for JavaScript. The Azure Identity library is required for the authentication used in this sample.
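npm install @azure/monitor-ingestion
npm install @azure/identity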
Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity library.
AZURE_TENANT_ID
AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
Replace the variables in the following sample code with values from your DCR. You might also want to replace the sample data with your own.
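A minimal JavaScript sketch, mirroring the other language samples; it assumes the LogsIngestionClient API from the @azure/monitor-ingestion package:

const { DefaultAzureCredential } = require("@azure/identity");
const { LogsIngestionClient } = require("@azure/monitor-ingestion");

// Replace with the logs ingestion URI, immutable ID, and stream name from your DCR
const endpoint = "https://my-url.monitor.azure.cn";
const ruleId = "dcr-00000000000000000000000000000000";
const streamName = "Custom-MyTableRawData";

async function main() {
    // DefaultAzureCredential reads AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET
    const credential = new DefaultAzureCredential();
    const client = new LogsIngestionClient(endpoint, credential);

    // Sample entries matching the stream's schema
    const logs = [
        {
            Time: new Date().toISOString(),
            Computer: "Computer1",
            AdditionalContext: {
                InstanceName: "user1",
                TimeZone: "Pacific Time",
                Level: 4,
                CounterName: "AppMetric1",
                CounterValue: 15.3,
            },
        },
    ];

    await client.upload(ruleId, streamName, logs);
    console.log("Log upload completed");
}

main().catch((err) => {
    console.error("Upload failed:", err);
});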
Execute the code, and the data should arrive in your Log Analytics workspace within a few minutes.
The following PowerShell code sends data to the endpoint by using HTTP REST fundamentals.
Note
This sample requires PowerShell v7.0 or later.
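You can check your installed version by running the following command:

$PSVersionTable.PSVersion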
Run the following sample PowerShell command, which adds a required assembly for the script.
Add-Type -AssemblyName System.Web
Replace the parameters in the Step 0 section with values from your application and DCR. You might also want to replace the sample data in the Step 2 section with your own.
### Step 0: Set variables required for the rest of the script.
# information needed to authenticate to Microsoft Entra ID and obtain a bearer token
$tenantId = "00000000-0000-0000-0000-000000000000" #Tenant ID the data collection endpoint resides in
$appId = "00000000-0000-0000-0000-000000000000" #Application ID created and granted permissions
$appSecret = "0000000000000000000000000000000000000000" #Secret created for the application
# information needed to send data to the DCR endpoint
$endpoint_uri = "https://my-url.monitor.azure.cn" #Logs ingestion URI for the DCR
$dcrImmutableId = "dcr-00000000000000000000000000000000" #the immutableId property of the DCR object
$streamName = "Custom-MyTableRawData" #name of the stream in the DCR that represents the destination table
### Step 1: Obtain a bearer token used later to authenticate against the DCR.
$scope= [System.Web.HttpUtility]::UrlEncode("https://monitor.azure.cn//.default")
$body = "client_id=$appId&scope=$scope&client_secret=$appSecret&grant_type=client_credentials";
$headers = @{"Content-Type"="application/x-www-form-urlencoded"};
$uri = "https://login.partner.microsoftonline.cn/$tenantId/oauth2/v2.0/token"
$bearerToken = (Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers).access_token
### Step 2: Create some sample data.
$currentTime = Get-Date ([datetime]::UtcNow) -Format O
$staticData = @"
[
{
"Time": "$currentTime",
"Computer": "Computer1",
"AdditionalContext": {
"InstanceName": "user1",
"TimeZone": "Pacific Time",
"Level": 4,
"CounterName": "AppMetric1",
"CounterValue": 15.3
}
},
{
"Time": "$currentTime",
"Computer": "Computer2",
"AdditionalContext": {
"InstanceName": "user2",
"TimeZone": "Central Time",
"Level": 3,
"CounterName": "AppMetric1",
"CounterValue": 23.5
}
}
]
"@;
### Step 3: Send the data to the Log Analytics workspace.
$body = $staticData;
$headers = @{"Authorization"="Bearer $bearerToken";"Content-Type"="application/json"};
$uri = "$endpoint_uri/dataCollectionRules/$dcrImmutableId/streams/$($streamName)?api-version=2023-01-01"
$uploadResponse = Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers
Note
If you receive an Unable to find type [System.Web.HttpUtility] error, run the Add-Type -AssemblyName System.Web command shown earlier on its own, and then run the script again. Running the command as part of the script won't resolve the issue; it must be executed separately.
Execute the script, and you should see an HTTP 204 (No Content) response. The data should arrive in your Log Analytics workspace within a few minutes.
Use pip to install the Azure Monitor Ingestion and Azure Identity client libraries for Python. The Azure Identity library is required for the authentication used in this sample.
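pip install azure-monitor-ingestion
pip install azure-identity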
Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity library.
AZURE_TENANT_ID
AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
Replace the variables in the following sample code with values from your DCR. You might also want to replace the sample data in the Step 2 section with your own.
# information needed to send data to the DCR endpoint
endpoint_uri = "https://my-url.monitor.azure.cn" # logs ingestion endpoint of the DCR
dcr_immutableid = "dcr-00000000000000000000000000000000" # immutableId property of the Data Collection Rule
stream_name = "Custom-MyTableRawData" #name of the stream in the DCR that represents the destination table
# Import required modules
import os
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient
from azure.core.exceptions import HttpResponseError
credential = DefaultAzureCredential()
client = LogsIngestionClient(endpoint=endpoint_uri, credential=credential, logging_enable=True)
body = [
    {
        "Time": "2023-03-12T15:04:48.423211Z",
        "Computer": "Computer1",
        "AdditionalContext": {
            "InstanceName": "user1",
            "TimeZone": "Pacific Time",
            "Level": 4,
            "CounterName": "AppMetric2",
            "CounterValue": 35.3
        }
    },
    {
        "Time": "2023-03-12T15:04:48.794972Z",
        "Computer": "Computer2",
        "AdditionalContext": {
            "InstanceName": "user2",
            "TimeZone": "Central Time",
            "Level": 3,
            "CounterName": "AppMetric2",
            "CounterValue": 43.5
        }
    }
]

try:
    client.upload(rule_id=dcr_immutableid, stream_name=stream_name, logs=body)
except HttpResponseError as e:
    print(f"Upload failed: {e}")
Execute the code, and the data should arrive in your Log Analytics workspace within a few minutes.
Troubleshooting
This section describes different error conditions you might receive and how to correct them.
Script returns error code 403
Ensure that you have the correct permissions for your application to the DCR. You might also need to wait up to 30 minutes for permissions to propagate.
Script returns error code 413 or warning of TimeoutExpired with the message ReadyBody_ClientConnectionAbort in the response
The message is too large. The maximum message size is currently 1 MB per call.
Script returns error code 429
API limits have been exceeded. The limits are currently set to 500 MB of data per minute for both compressed and uncompressed data and 300,000 requests per minute. Retry after the duration listed in the Retry-After header in the response.
Script returns error code 503
Ensure that you have the correct permissions for your application to the DCR. You might also need to wait up to 30 minutes for permissions to propagate.
You don't receive an error, but data doesn't appear in the workspace
The data might take some time to be ingested, especially the first time data is being sent to a particular table. It shouldn't take longer than 15 minutes.
IntelliSense in Log Analytics doesn't recognize the new table
The cache that drives IntelliSense might take up to 24 hours to update.