Tutorial: Stream live with Media Services by using .NET 5.0

In Azure Media Services, live events are responsible for processing live streaming content. A live event provides an input endpoint (ingest URL) that you then provide to a live encoder. The live event receives input streams from the live encoder and makes them available for streaming through one or more streaming endpoints. A live event also provides a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.

This tutorial shows you how to use .NET 5.0 to create a pass-through type of live event. In this tutorial, you will:

  • Download the sample app.
  • Examine the code that performs live streaming.
  • Watch the event with Azure Media Player at https://ampdemo.azureedge.net.
  • Clean up resources.

If you don't have an Azure trial subscription, create a trial subscription before you begin.

Note

Even though this tutorial uses .NET SDK examples, the general steps are the same for the REST API, the CLI, or other supported SDKs.

Prerequisites

You need the following items to complete this tutorial:

You also need these items for live-streaming software:

  • A camera or a device (such as a laptop) that's used to broadcast an event.

  • An on-premises software encoder that encodes your camera stream and sends it to the Media Services live-streaming service through the Real-Time Messaging Protocol (RTMP). For more information, see Recommended on-premises live encoders. The stream has to be in RTMP or Smooth Streaming format.

    This sample assumes that you'll use Open Broadcaster Software (OBS) Studio to broadcast RTMP to the ingest endpoint. Install OBS Studio.

Tip

Review Live streaming with Media Services v3 before proceeding.

Download and configure the sample

Clone the GitHub repository that contains the live-streaming .NET sample to your machine by using the following command:

git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git

The live-streaming sample is in the Live folder.

Open appsettings.json in your downloaded project. Replace the values with the credentials that you got from accessing the Azure Media Services API by using the Azure CLI.

Note that you can also use the .env file format at the root of the project to set your environment variables only once for all projects in the .NET samples repository. Just copy the sample.env file, and then fill in the information that you got from the Media Services API access page in the Azure portal or from the Azure CLI. Rename the sample.env file to .env to use it across all projects.
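As a sketch of that setup, a filled-in .env might look like the following. The variable names here mirror the repository's sample.env; every value shown is a placeholder that you'd replace with your own credentials from the portal or the CLI, and the exact set of variables can differ between repository versions, so treat the sample.env in your clone as the authoritative list:

```ini
# Placeholder values -- substitute your own credentials.
AADCLIENTID=00000000-0000-0000-0000-000000000000
AADSECRET=<service-principal-secret>
AADTENANTID=00000000-0000-0000-0000-000000000000
SUBSCRIPTIONID=00000000-0000-0000-0000-000000000000
RESOURCEGROUP=<resource-group-name>
ACCOUNTNAME=<media-services-account-name>
```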

The .gitignore file is already configured to prevent publishing this file to your forked repository.

Important

This sample uses a unique suffix for each resource. If you cancel the debugging or terminate the app partway through, you'll end up with multiple live events in your account.

Be sure to stop the running live events. Otherwise, you'll be billed!

Examine the code that performs live streaming

This section examines the functions defined in the Authentication.cs and Program.cs files of the LiveEventWithDVR project.

The sample creates a unique suffix for each resource so that you don't have name collisions if you run the sample multiple times without cleaning up.

Start using Media Services APIs with the .NET SDK

To start using Media Services APIs with .NET, you need to create an AzureMediaServicesClient object. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory. Another option is to use interactive authentication, which is implemented in GetCredentialsInteractiveAuthAsync.

public static async Task<IAzureMediaServicesClient> CreateMediaServicesClientAsync(ConfigWrapper config, bool interactive = false)
{
    ServiceClientCredentials credentials;
    if (interactive)
        credentials = await GetCredentialsInteractiveAuthAsync(config);
    else
        credentials = await GetCredentialsAsync(config);

    return new AzureMediaServicesClient(config.ArmEndpoint, credentials)
    {
        SubscriptionId = config.SubscriptionId,
    };
}

In the code that you cloned at the beginning of the article, the GetCredentialsAsync function creates the ServiceClientCredentials object based on the credentials supplied in the local configuration file (appsettings.json) or through the .env environment variables file in the root of the repository.

private static async Task<ServiceClientCredentials> GetCredentialsAsync(ConfigWrapper config)
{
    // Use ConfidentialClientApplicationBuilder.AcquireTokenForClient to get a token using a service principal with symmetric key

    var scopes = new[] { config.ArmAadAudience + "/.default" };

    var app = ConfidentialClientApplicationBuilder.Create(config.AadClientId)
        .WithClientSecret(config.AadSecret)
        .WithAuthority(AzureCloudInstance.AzureChina, config.AadTenantId)
        .Build();

    var authResult = await app.AcquireTokenForClient(scopes)
                                             .ExecuteAsync()
                                             .ConfigureAwait(false);

    return new TokenCredentials(authResult.AccessToken, TokenType);
}

In the case of interactive authentication, the GetCredentialsInteractiveAuthAsync function creates the ServiceClientCredentials object based on interactive authentication and the connection parameters supplied in the local configuration file (appsettings.json) or through the .env environment variables file in the root of the repository. In this case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.

private static async Task<ServiceClientCredentials> GetCredentialsInteractiveAuthAsync(ConfigWrapper config)
{
    var scopes = new[] { config.ArmAadAudience + "/user_impersonation" };

    // Well-known client application ID of the Azure CLI
    string ClientApplicationId = "04b07795-8ddb-461a-bbee-02f9e1bf7b46";

    AuthenticationResult result = null;

    IPublicClientApplication app = PublicClientApplicationBuilder.Create(ClientApplicationId)
        .WithAuthority(AzureCloudInstance.AzureChina, config.AadTenantId)
        .WithRedirectUri("http://localhost")
        .Build();

    var accounts = await app.GetAccountsAsync();

    try
    {
        result = await app.AcquireTokenSilent(scopes, accounts.FirstOrDefault()).ExecuteAsync();
    }
    catch (MsalUiRequiredException ex)
    {
        try
        {
            result = await app.AcquireTokenInteractive(scopes).ExecuteAsync();
        }
        catch (MsalException msalException)
        {
            Console.Error.WriteLine($"ERROR: MSAL interactive authentication exception with code '{msalException.ErrorCode}' and message '{msalException.Message}'.");
        }
    }
    catch (MsalException msalException)
    {
        Console.Error.WriteLine($"ERROR: MSAL silent authentication exception with code '{msalException.ErrorCode}' and message '{msalException.Message}'.");
    }

    // result can be null if both silent and interactive authentication failed above.
    if (result == null)
    {
        throw new InvalidOperationException("MSAL authentication failed; no access token was acquired.");
    }

    return new TokenCredentials(result.AccessToken, TokenType);
}

Create a live event

This section shows how to create a pass-through type of live event (LiveEventEncodingType set to None). For information about the available types, see Live event types. In addition to pass-through, you can use a live transcoding event for 720p or 1080p adaptive-bitrate cloud encoding.

Here are some things that you might want to specify when you're creating the live event:

  • The ingest protocol for the live event. Currently, the RTMP, RTMPS, and Smooth Streaming protocols are supported. You can't change the protocol option while the live event or its associated live outputs are running. If you need different protocols, create a separate live event for each streaming protocol.

  • IP restrictions on the ingest and preview. You can define the IP addresses that are allowed to ingest video to this live event. Allowed IP addresses can be specified as one of these choices:

    • A single IP address (for example, 10.0.0.1)
    • An IP range that uses an IP address and a Classless Inter-Domain Routing (CIDR) subnet mask (for example, 10.0.0.1/22)
    • An IP range that uses an IP address and a dotted-decimal subnet mask (for example, 10.0.0.1(255.255.252.0))

    If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set 0.0.0.0/0. The IP addresses have to be in one of the following formats: an IPv4 address with four numbers, or a CIDR address range.

  • Autostart on an event as you create it. When autostart is set to true, the live event will start after creation. That means the billing starts as soon as the live event starts running. You must explicitly call Stop on the live event resource to halt further billing. For more information, see Live event states and billing.

    Standby mode is available for starting a live event in a lower-cost "allocated" state that makes it faster to move to a "running" state. It's useful for situations like hot pools that need to hand out channels quickly to streamers.

  • A static host name and a unique GUID. For an ingest URL to be predictive and easier to maintain in a hardware-based live encoder, set the useStaticHostname property to true. For detailed information, see Live event ingest URLs.
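The allowed-IP formats above map onto the IPRange model that the sample code below also uses. As a short sketch (assuming the IPRange type from the Microsoft.Azure.Management.Media SDK that the sample already references; the range names are illustrative), a single address and a CIDR range would be expressed like this:

```csharp
using Microsoft.Azure.Management.Media.Models;

// A single IPv4 address: a /32 prefix allows only this exact address.
IPRange singleAddress = new IPRange(
    name: "EncoderHost",          // hypothetical name for illustration
    address: "10.0.0.1",
    subnetPrefixLength: 32);

// A CIDR range such as 10.0.0.1/22: the prefix length carries the mask,
// equivalent to the dotted-decimal form 255.255.252.0.
IPRange encoderSubnet = new IPRange(
    name: "EncoderSubnet",        // hypothetical name for illustration
    address: "10.0.0.1",
    subnetPrefixLength: 22);
```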

Console.WriteLine($"Creating a live event named {liveEventName}");
Console.WriteLine();

// Creating the LiveEvent - the primary object for live streaming in AMS. 
// See the overview - https://docs.azure.cn/media-services/latest/live-streaming-overview

// Create the LiveEvent

// Understand the concepts of what a live event and a live output is in AMS first!
// Read the following - https://docs.azure.cn/media-services/latest/live-events-outputs-concept
// 1) Understand the billing implications for the various states
// 2) Understand the different live event types, pass-through and encoding
// 3) Understand how to use long-running async operations 
// 4) Understand the available Standby mode and how it differs from the Running Mode. 
// 5) Understand the differences between a LiveOutput and the Asset that it records to.  They are two different concepts.
//    A live output can be considered as the "tape recorder" and the Asset is the tape that is inserted into it for recording.
// 6) Understand the advanced options such as low latency support. 
//    Low Latency - https://docs.azure.cn/media-services/latest/live-event-latency

// When broadcasting to a live event, please use one of the verified on-premises live streaming encoders.
// While operating this tutorial, it is recommended to start out using OBS Studio before moving to another encoder. 

// Note: When creating a LiveEvent, you can specify allowed IP addresses in one of the following formats:                 
//      IpV4 address with 4 numbers
//      CIDR address range  

IPRange allAllowIPRange = new IPRange(
    name: "AllowAll",
    address: "0.0.0.0",
    subnetPrefixLength: 0
);

// Create the LiveEvent input IP access control object
// this will control the IP that the encoder is running on and restrict access to only that encoder IP range.
LiveEventInputAccessControl liveEventInputAccess = new LiveEventInputAccessControl
{
    Ip = new IPAccessControl(
            allow: new IPRange[]
            {
                // re-use the same range here for the sample, but in production you can lock this
                // down to the ip range for your on-premises live encoder, laptop, or device that is sending
                // the live stream
                allAllowIPRange
            }
        )

};

// Create the LiveEvent Preview IP access control object. 
// This will restrict which clients can view the preview endpoint
LiveEventPreview liveEventPreview = new LiveEventPreview
{
    AccessControl = new LiveEventPreviewAccessControl(
        ip: new IPAccessControl(
            allow: new IPRange[]
            {
                 // re-use the same range here for the sample, but in production you can lock this to the IPs of your 
                // devices that would be monitoring the live preview. 
                allAllowIPRange
            }
        )
    )
};

// To get the same ingest URL for the same LiveEvent name:
// 1. Set useStaticHostname to true so you have ingest like: 
//        rtmps://liveevent-hevc12-eventgridmediaservice-cne22.channel.media.chinacloudapi.cn:2935/live/522f9b27dd2d4b26aeb9ef8ab96c5c77           
// 2. Set the inputs:accessToken to a desired GUID string (with or without hyphen) to make it simpler to update your encoder settings

// See REST API documentation for details on each setting value
// https://docs.microsoft.com/rest/api/media/liveevents/create 

LiveEvent liveEvent = new LiveEvent(
    location: mediaService.Location,
    description: "Sample LiveEvent from .NET SDK sample",
    // Set useStaticHostname to true to make the ingest and preview URL host name the same. 
    // This can slow things down a bit. 
    useStaticHostname: true,

    // 1) Set up the input settings for the Live event...
    input: new LiveEventInput(
        streamingProtocol: LiveEventInputProtocol.RTMP,  // options are RTMP or Smooth Streaming ingest format.
                                                         // This sets a static access token for use on the ingest path. 
                                                         // Combining this with useStaticHostname:true will give you the same ingest URL on every creation.
                                                         // This is helpful when you only want to enter the URL into a single encoder one time for this Live Event name
        accessToken: "acf7b6ef-8a37-425f-b8fc-51c2d6a5a86a",  // Use this value when you want to make sure the ingest URL is static and always the same. If omitted, the service will generate a random GUID value.
        accessControl: liveEventInputAccess, // controls the IP restriction for the source encoder.
        keyFrameIntervalDuration: "PT2S" // Set this to match the ingest encoder's settings
    ),
    // 2) Set the live event to use pass-through or cloud encoding modes...
    encoding: new LiveEventEncoding(
        // Set this to Standard or Premium1080P to use the cloud live encoder.
        // Otherwise, leave as "None" to use pass-through mode
        encodingType: LiveEventEncodingType.None // also known as pass-through mode.
                                                 // OPTIONAL settings when using live cloud encoding type:
                                                 // keyFrameInterval: "PT2S", //If this value is not set for an encoding live event, the fragment duration defaults to 2 seconds. The value cannot be set for pass-through live events.
                                                 // presetName: null, // only used for custom defined presets. 
                                                 //stretchMode: "None" // can be used to determine stretch on encoder mode
    ),
    // 3) Set up the Preview endpoint for monitoring based on the settings above we already set.
    preview: liveEventPreview,
    // 4) Set up more advanced options on the live event. Low Latency is the most common one.
    streamOptions: new List<StreamOptionsFlag?>()
    {
        // Set this to Default or Low Latency
        // When using Low Latency mode, you must configure the Azure Media Player to use the 
        // quick start heuristic profile or you won't notice the change. 
        // In the AMP player client side JS options, set -  heuristicProfile: "Low Latency Heuristic Profile". 
        // To use low latency optimally, you should tune your encoder settings down to 1 second GOP size instead of 2 seconds.
        StreamOptionsFlag.LowLatency
    }
);

// Start monitoring LiveEvent events using Event Grid and Event Hub
try
{
    // Please refer README for Event Hub and storage settings.
    Console.WriteLine("Starting monitoring LiveEvent events...");
    string StorageConnectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1};EndpointSuffix=core.chinacloudapi.cn",
        config.StorageAccountName, config.StorageAccountKey);

    // Create a new host to process events from an Event Hub.
    Console.WriteLine("Creating a new host to process events from an Event Hub...");
    eventProcessorHost = new EventProcessorHost(config.EventHubName,
        PartitionReceiver.DefaultConsumerGroupName, config.EventHubConnectionString,
        StorageConnectionString, config.StorageContainerName);

    // Registers the Event Processor Host and starts receiving messages.
    await eventProcessorHost.RegisterEventProcessorFactoryAsync(new MediaServicesEventProcessorFactory(liveEventName),
        EventProcessorOptions.DefaultOptions);
}
catch (Exception e)
{
    Console.WriteLine("Failed to connect to Event Hub, please refer README for Event Hub and storage settings. Skipping event monitoring...");
    Console.WriteLine(e.Message);
}

Console.WriteLine("Creating the LiveEvent, please be patient as this can take time to complete async.");
Console.WriteLine("Live Event creation is an async operation in Azure and timing can depend on resources available.");

// When autostart is set to true, the Live Event will be started after creation. 
// That means, the billing starts as soon as the Live Event starts running. 
// You must explicitly call Stop on the Live Event resource to halt further billing.
// The following operation can sometimes take a while. Be patient.
// An optional workflow is to first call allocate() instead of create.
// https://docs.microsoft.com/en-us/rest/api/media/liveevents/allocate 
// This allows you to allocate the resources and place the live event into a "Standby" mode until 
// you are ready to transition to "Running". This is useful when you want to pool resources in a warm "Standby" state at a reduced cost.
// The transition from Standby to "Running" is much faster than cold creation to "Running" using the autostart property.
// Returns a long running operation polling object that can be used to poll until completion.

Stopwatch watch = Stopwatch.StartNew();
liveEvent = await client.LiveEvents.CreateAsync(
    config.ResourceGroup,
    config.AccountName,
    liveEventName,
    liveEvent,
    // When autostart is set to true, you should "await" this method operation to complete. 
    // The Live Event will be started after creation. 
    // You may choose not to do this, but create the object, and then start it using the standby state to 
    // keep the resources "warm" and billing at a lower cost until you are ready to go live. 
    // That increases the speed of startup when you are ready to go live. 
    autoStart: false);
watch.Stop();
string elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
Console.WriteLine($"Create Live Event run time : {elapsedTime}");

Get ingest URLs

After the live event is created, you can get ingest URLs that you'll provide to the live encoder. The encoder uses these URLs to input a live stream.

// Get the RTMP ingest URL to configure in OBS Studio.
// The Endpoints property is a collection of RTMP primary and secondary, and RTMPS primary and secondary URLs.
// To get the primary secure RTMPS URL, it's usually going to be index 3, but you could add a loop here to confirm.
string ingestUrl = liveEvent.Input.Endpoints.First().Url;
Console.WriteLine($"The RTMP ingest URL to enter into OBS Studio is:");
Console.WriteLine($"\t{ingestUrl}");
Console.WriteLine("Make sure to enter a Stream Key into the OBS studio settings. It can be any value or you can repeat the accessToken used in the ingest URL path.");
Console.WriteLine();

Get the preview URL

Use previewEndpoint to preview and verify that the input from the encoder is being received.

Important

Make sure that the video is flowing to the preview URL before you continue.

// Use the previewEndpoint to preview and verify
// that the input from the encoder is actually being received
// The preview endpoint URL also supports various format strings, for example HLS (format=m3u8-cmaf) and DASH (format=mpd-time-cmaf).
// The default manifest is Smooth. 
string previewEndpoint = liveEvent.Preview.Endpoints.First().Url;
Console.WriteLine($"The preview url is:");
Console.WriteLine($"\t{previewEndpoint}");
Console.WriteLine();

Console.WriteLine($"Open the live preview in your browser and use the Azure Media Player to monitor the preview playback:");
Console.WriteLine($"\thttps://ampdemo.azureedge.net/?url={previewEndpoint}&heuristicprofile=lowlatency");
Console.WriteLine();

Create and manage live events and live outputs

After you have the stream flowing into the live event, you can begin the streaming event by creating an asset, a live output, and a streaming locator. This will archive the stream and make it available to viewers through the streaming endpoint.

When you're learning these concepts, it's helpful to think of the asset object as the tape that you would insert into a video tape recorder in the old days. The live output is the tape recorder machine. The live event is just the video signal coming into the back of the machine.

You first create the signal by creating the live event. The signal is not flowing until you start that live event and connect your encoder to the input.

The "tape" can be created at any time. It's just an empty asset that you'll hand to the live output object, the "tape recorder" in this analogy.

The "tape recorder" can also be created at any time. You can create a live output before or after you start the signal flow. If you need to speed things up, it's sometimes helpful to create the output before you start the signal flow.

To stop the "tape recorder", you call delete on the LiveOutput. This action doesn't delete the contents of the "tape" (asset). The asset always keeps the archived video content until you explicitly call delete on the asset itself.

The next section walks through the creation of the asset and the live output.

Create an asset

Create an asset for the live output to use. In our analogy, the asset is the "tape" that the live video signal will be recorded onto. Viewers will be able to see the contents live or on demand from this virtual tape.

// Create an Asset for the LiveOutput to use. Think of this as the "tape" that will be recorded to. 
// The asset entity points to a folder/container in your Azure Storage account. 
Console.WriteLine($"Creating an asset named {assetName}");
Console.WriteLine();
Asset asset = await client.Assets.CreateOrUpdateAsync(config.ResourceGroup, config.AccountName, assetName, new Asset());

Create a live output

Live outputs start when they're created and stop when they're deleted. Deleting a live output doesn't delete the underlying asset or the content in the asset. Think of it as ejecting the "tape". The asset with the recording will last as long as you like. When it's ejected (that is, when the live output is deleted), it's available for on-demand viewing immediately.

// Create the Live Output - think of this as the "tape recorder for the live event". 
// Live outputs are optional, but are required if you want to archive the event to storage,
// use the asset for on-demand playback later, or if you want to enable cloud DVR time-shifting.
// We will use the asset created above for the "tape" to record to. 
string manifestName = "output";
Console.WriteLine($"Creating a live output named {liveOutputName}");
Console.WriteLine();

watch = Stopwatch.StartNew();
// See the REST API for details on each of the settings on Live Output
// https://docs.microsoft.com/rest/api/media/liveoutputs/create
LiveOutput liveOutput = new LiveOutput(
    assetName: asset.Name,
    manifestName: manifestName, // The HLS and DASH manifest file name. This is recommended to set if you want a deterministic manifest path up front.
                                // archive window can be set from 3 minutes to 25 hours. Content that falls outside of ArchiveWindowLength
                                // is continuously discarded from storage and is non-recoverable. For a full event archive, set to the maximum, 25 hours.
    archiveWindowLength: TimeSpan.FromHours(1)
);
liveOutput = await client.LiveOutputs.CreateAsync(
    config.ResourceGroup,
    config.AccountName,
    liveEventName,
    liveOutputName,
    liveOutput);
elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
Console.WriteLine($"Create Live Output run time : {elapsedTime}");
Console.WriteLine();

Create a streaming locator

Note

When your Media Services account is created, a default streaming endpoint is added to your account in the stopped state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint that you want to stream content from has to be in the running state.

When you publish the asset by using a streaming locator, the live event (up to the DVR window length) continues to be viewable until the streaming locator expires or is deleted, whichever comes first. This is how you make the virtual "tape" recording available for your viewing audience to see live and on demand. The same URL can be used to watch the live event, the DVR window, or the on-demand asset when the recording is complete (when the live output is deleted).

Console.WriteLine($"Creating a streaming locator named {drvStreamingLocatorName}");
Console.WriteLine();

IList<string> filters = new List<string>();
filters.Add(drvAssetFilterName);
StreamingLocator locator = await client.StreamingLocators.CreateAsync(config.ResourceGroup,
    config.AccountName,
    drvStreamingLocatorName,
    new StreamingLocator
    {
        AssetName = assetName,
        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly,
        Filters = filters   // Associate the dvr filter with StreamingLocator.
    });

// Get the default Streaming Endpoint on the account
StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(config.ResourceGroup, config.AccountName, streamingEndpointName);

// If it's not running, Start it. 
if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
{
    Console.WriteLine("Streaming Endpoint was Stopped, restarting now..");
    await client.StreamingEndpoints.StartAsync(config.ResourceGroup, config.AccountName, streamingEndpointName);

    // Since we started the endpoint, we should stop it in cleanup.
    stopEndpoint = true;
}

// Get the URL to stream the output
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(config.ResourceGroup, config.AccountName, drvStreamingLocatorName);

foreach (StreamingPath path in paths.StreamingPaths)
{
    UriBuilder uriBuilder = new UriBuilder();
    uriBuilder.Scheme = "https";
    uriBuilder.Host = streamingEndpoint.HostName;

    uriBuilder.Path = path.Paths[0];
    // Print the playable streaming URL built from the endpoint host name and path
    Console.WriteLine($"\t{uriBuilder}");
}

Clean up resources in your Media Services account

If you're done streaming events and want to clean up the resources that you provisioned earlier, follow this procedure:

  1. Stop pushing the stream from the encoder.
  2. Stop the live event. After the live event is stopped, it won't incur any charges. When you need to start it again, it will have the same ingest URL, so you won't need to reconfigure your encoder.
  3. Stop the streaming endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the live event is in a stopped state, it won't incur any charges.

private static async Task CleanupLiveEventAndOutputAsync(IAzureMediaServicesClient client, string resourceGroup, string accountName, string liveEventName, string liveOutputName)
{
    try
    {
        LiveEvent liveEvent = await client.LiveEvents.GetAsync(resourceGroup, accountName, liveEventName);

        Console.WriteLine("Deleting Live Output");
        Stopwatch watch = Stopwatch.StartNew();

        await client.LiveOutputs.DeleteAsync(resourceGroup, accountName, liveEventName, liveOutputName);

        String elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
        Console.WriteLine($"Delete Live Output run time : {elapsedTime}");

        if (liveEvent != null)
        {
            if (liveEvent.ResourceState == LiveEventResourceState.Running)
            {
                watch = Stopwatch.StartNew();
                // If the live event is running, stop it. The live output was already deleted above,
                // so there's no need to remove outputs on stop (removeOutputsOnStop: false).
                await client.LiveEvents.StopAsync(resourceGroup, accountName, liveEventName, removeOutputsOnStop: false);
                elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
                Console.WriteLine($"Stop Live Event run time : {elapsedTime}");
            }

            // Delete the LiveEvent
            await client.LiveEvents.DeleteAsync(resourceGroup, accountName, liveEventName);
        }
    }
    catch (ApiErrorException e)
    {
        Console.WriteLine("CleanupLiveEventAndOutputAsync -- Hit ApiErrorException");
        Console.WriteLine($"\tCode: {e.Body.Error.Code}");
        Console.WriteLine($"\tMessage: {e.Body.Error.Message}");
        Console.WriteLine();
    }
}
private static async Task CleanupLocatorandAssetAsync(IAzureMediaServicesClient client, string resourceGroup, string accountName, string streamingLocatorName, string assetName)
{
    try
    {
        // Delete the Streaming Locator
        await client.StreamingLocators.DeleteAsync(resourceGroup, accountName, streamingLocatorName);

        // Delete the Archive Asset
        await client.Assets.DeleteAsync(resourceGroup, accountName, assetName);
    }
    catch (ApiErrorException e)
    {
        Console.WriteLine("CleanupLocatorandAssetAsync -- Hit ApiErrorException");
        Console.WriteLine($"\tCode: {e.Body.Error.Code}");
        Console.WriteLine($"\tMessage: {e.Body.Error.Message}");
        Console.WriteLine();
    }
}

Watch the event

To watch the event, copy the streaming URL that you got when you ran the code to create a streaming locator. You can use a media player of your choice. Azure Media Player is available to test your stream at https://ampdemo.azureedge.net.

A live event automatically converts to an on-demand event when it's stopped. Even after you stop and delete the event, users can stream your archived content as a video on demand for as long as you don't delete the asset. An asset can't be deleted while it's being used by an event; the event must be deleted first.

Clean up remaining resources

If you no longer need any of the resources in your resource group, including the Media Services and storage accounts that you created for this tutorial, delete the resource group that you created earlier.

Run the following CLI command:

az group delete --name amsResourceGroup

Important

Leaving the live event running incurs billing costs. Be aware that if the project or program stops responding or is closed for any reason, it might leave the live event running in a billing state.

Next steps

Stream files