Create an app to get data using the managed streaming ingestion client

Applies to: ✅ Azure Data Explorer

Streaming ingestion allows writing data to Kusto with near-real-time latency. It's also useful for writing small amounts of data to a large number of tables, where batching would be inefficient.

In this article, you’ll learn how to ingest data to Kusto using the managed streaming ingestion client. You'll ingest a data stream in the form of a file or in-memory stream.

Note

Streaming ingestion is a high-velocity ingestion protocol. It isn't the same as IngestFromStream, which is an API that takes an in-memory stream and sends it for ingestion. IngestFromStream is available in all ingestion client implementations, including queued and streaming ingestion.

Streaming and Managed Streaming

Kusto SDKs provide two flavors of streaming ingestion clients, StreamingIngestionClient and ManagedStreamingIngestionClient, where the managed variant adds built-in retry and failover logic.

When ingesting with the ManagedStreamingIngestionClient API, failures and retries are handled automatically as follows:

  • Streaming requests that fail due to server-side size limitations are moved to queued ingestion.
  • Data that's larger than 4 MB is automatically sent to queued ingestion, regardless of format or compression.
  • Transient failures, such as throttling, are retried three times and then moved to queued ingestion.
  • Permanent failures aren't retried.

Note

If the streaming ingestion fails and the data is moved to queued ingestion, some delay is expected before the data is visible in the table.
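If you need to diagnose such fallbacks, errors on the queued ingestion path can be inspected with the .show ingestion failures management command. A sketch (run against your database):

.show ingestion failures
| top 10 by FailedOn desc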

Limitations

Streaming ingestion has some limitations compared to queuing data for ingestion.

  • Tags can’t be set on data.
  • Mapping can only be provided using ingestionMappingReference. Inline mapping isn't supported.
  • The payload sent in the request can’t exceed 10 MB, regardless of format or compression.
  • The ignoreFirstRecord property isn't supported for managed streaming ingestion, so ingested data must not contain a header row.

For more information, see Streaming Limitations.
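Because inline mapping isn't supported, any mapping must be created on the table in advance and then referenced by name through ingestionMappingReference. As an illustration (the mapping name MyCsvMapping is hypothetical), a CSV mapping could be created with a management command along these lines:

.create table MyStormEvents ingestion csv mapping "MyCsvMapping"
'[{"column":"StartTime","Properties":{"Ordinal":"0"}},{"column":"EndTime","Properties":{"Ordinal":"1"}},{"column":"State","Properties":{"Ordinal":"2"}}]'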

Prerequisites

Before creating the app, complete the following steps. Each step is detailed in the sections that follow.

  1. Configure streaming ingestion on your Azure Data Explorer cluster.
  2. Create a Kusto table to ingest the data into.
  3. Enable the streaming ingestion policy on the table.
  4. Download the stormevents.csv sample data file containing 1,000 storm event records.

Configure streaming ingestion

To configure streaming ingestion, see Configure streaming ingestion on your Azure Data Explorer cluster. It can take several minutes for the configuration to take effect.

Create a Kusto table

Run the following commands on your database via Kusto Explorer (Desktop) or Kusto Web Explorer.

  1. Create a table called MyStormEvents:
.create table MyStormEvents (StartTime:datetime, EndTime:datetime, State:string, DamageProperty:int, DamageCrops:int, Source:string, StormSummary:dynamic)

Enable the streaming ingestion policy

Enable streaming ingestion on the table or on the entire database using one of the following commands:

Table level:

.alter table <tableName> policy streamingingestion enable

Database level:

.alter database <databaseName> policy streamingingestion enable

It can take up to two minutes for the policy to take effect.

For more information about streaming policy, see Streaming ingestion policy.
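To confirm that the policy is in place on the table, you can query it with a .show command, for example:

.show table MyStormEvents policy streamingingestion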

Create a basic client application

Create a basic client application that connects to your cluster. Enter your cluster's query URI, ingest URI, and database name in the relevant variables. The app uses two clients: one for querying and one for ingestion. Each client opens a browser window to authenticate the user.

The code sample includes a helper function PrintResultAsValueList() for printing query results.

Add the Kusto libraries using the following commands:

dotnet add package Microsoft.Azure.Kusto.Data
dotnet add package Microsoft.Azure.Kusto.Ingest

Then add the following code:

using System;
using Kusto.Data;
using Kusto.Data.Net.Client;
using Kusto.Ingest;
using Kusto.Data.Common;
using Microsoft.Identity.Client;
using System.Data;
using System.IO;
using System.Text;

class Program
{
    static void Main(string[] args)
    {
        var tableName = "MyStormEvents";
        var clusterUrl = "<KustoClusterQueryURI>";
        var ingestionUrl = "<KustoClusterIngestURI>";
        var databaseName = "<databaseName>";

        var clusterKcsb = new KustoConnectionStringBuilder(clusterUrl).WithAadUserPromptAuthentication();
        var ingestionKcsb = new KustoConnectionStringBuilder(ingestionUrl).WithAadUserPromptAuthentication();

        using (var kustoClient = KustoClientFactory.CreateCslQueryProvider(clusterKcsb))
        using (var ingestClient = KustoIngestFactory.CreateManagedStreamingIngestClient(clusterKcsb, ingestionKcsb))
        {
            Console.WriteLine("Number of rows in " + tableName);
            var result = kustoClient.ExecuteQuery(databaseName, tableName + " | count", new ClientRequestProperties());
    
            PrintResultAsValueList(result);
        }
    }


    static void PrintResultAsValueList(IDataReader result)
    {
        var row = 0;
        while (result.Read())
        {
            row++;
            Console.WriteLine("row " + row + " :");
            for (int i = 0; i < result.FieldCount; i++)
            {
                Console.WriteLine("\t" + result.GetName(i) + " - " + result.GetValue(i));
            }
            Console.WriteLine();
        }
    }
}

Stream a file for ingestion

Use the IngestFromStorageAsync method to ingest the stormevents.csv file.

Copy the stormevents.csv file to the same location as your app. Because the input is a CSV file, use Format = DataSourceFormat.csv in the ingestion properties.

Add an ingestion section by appending the following lines to the end of the using block in Main():

var ingestProperties = new KustoIngestionProperties(databaseName, tableName)
{
    Format = DataSourceFormat.csv
};

// Ingestion section
Console.WriteLine("Ingesting data from a file");
ingestClient.IngestFromStorageAsync(".\\stormevents.csv", ingestProperties).Wait();

Let’s also query the new number of rows and the most recent row after the ingestion. Add the following lines after the ingestion command:

Console.WriteLine("New number of rows in " + tableName);
result = kustoClient.ExecuteQuery(databaseName, tableName + " | count", new ClientRequestProperties());
PrintResultAsValueList(result);

Console.WriteLine("Example line from " + tableName);
result = kustoClient.ExecuteQuery(databaseName, tableName + " | top 1 by EndTime", new ClientRequestProperties());
PrintResultAsValueList(result);

The first time you run the application, the results are as follows:

Number of rows in MyStormEvents
row 1 :
         Count - 0
Ingesting data from a file
New number of rows in MyStormEvents
row 1 :
         Count - 1000
Example line from MyStormEvents
row 1 :
         StartTime - 2007-12-31 11:15:00+00:00
         EndTime - 2007-12-31 13:21:00+00:00
         State - HAWAII
         DamageProperty - 0
         DamageCrops - 0
         Source - COOP Observer
         StormSummary - {'TotalDamages': 0, 'StartTime': '2007-12-31T11:15:00.0000000Z', 'EndTime': '2007-12-31T13:21:00.0000000Z', 'Details': {'Description': 'Heavy showers caused flash flooding in the eastern part of Molokai.  Water was running over the bridge at Halawa Valley.', 'Location': 'HAWAII'}}

Stream in-memory data for ingestion

To ingest data from memory, create a stream containing the data for ingestion.

To ingest the stream from memory, call the IngestFromStreamAsync() method.

Replace the ingestion section with the following code:

// Ingestion section
Console.WriteLine("Ingesting data from memory");
var singleLine = "2018-01-26 00:00:00.0000000,2018-01-27 14:00:00.0000000,MEXICO,0,0,Unknown,'{}'";
byte[] byteArray = Encoding.UTF8.GetBytes(singleLine);
using (MemoryStream stream = new MemoryStream(byteArray))
{
    var streamSourceOptions = new StreamSourceOptions
    {
        LeaveOpen = false
    };
    ingestClient.IngestFromStreamAsync(stream, ingestProperties, streamSourceOptions).Wait();
}

The results are as follows:

Number of rows in MyStormEvents
row 1 :
	 Count - 1000

Ingesting data from memory

New number of rows in MyStormEvents
row 1 :
	 Count - 1001

Example line from MyStormEvents
row 1 :
	 StartTime - 2018-01-26 00:00:00+00:00
	 EndTime - 2018-01-27 14:00:00+00:00
	 State - MEXICO
	 DamageProperty - 0
	 DamageCrops - 0
	 Source - Unknown
	 StormSummary - {}

Resources