Azure Storage analytics logging

Storage Analytics logs detailed information about successful and failed requests to a storage service. This information can be used to monitor individual requests and to diagnose issues with a storage service. Requests are logged on a best-effort basis.

Storage Analytics logging is not enabled by default for your storage account. You can enable it in the Azure portal; for details, see Monitor a storage account in the Azure portal. You can also enable Storage Analytics programmatically via the REST API or the client library. Use the Set Blob Service Properties, Set Queue Service Properties, and Set Table Service Properties operations to enable Storage Analytics for each service.

Log entries are created only if there are requests made against the service endpoint. For example, if a storage account has activity in its Blob endpoint but not in its Table or Queue endpoints, only logs pertaining to the Blob service will be created.

Note

Storage Analytics logging is currently available only for the Blob, Queue, and Table services. Storage Analytics logging is also available for premium-performance BlockBlobStorage accounts. However, it isn't available for general-purpose v2 accounts with premium performance.

Requests that are logged

Logging authenticated requests

The following types of authenticated requests are logged:

  • Successful requests

  • Failed requests, including timeout, throttling, network, authorization, and other errors

  • Requests that use a Shared Access Signature (SAS) or OAuth, including failed and successful requests

  • Requests to analytics data

    Requests made by Storage Analytics itself, such as log creation or deletion, are not logged. A full list of the logged data is documented in the Storage Analytics Logged Operations and Status Messages and Storage Analytics Log Format topics.

Logging anonymous requests

The following types of anonymous requests are logged:

  • Successful requests

  • Server errors

  • Time out errors for both client and server

  • Failed GET requests with error code 304 (Not Modified)

How logs are stored

All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.chinacloudapi.cn/$logs. This container cannot be deleted once Storage Analytics has been enabled, though its contents can be deleted. If you use your storage-browsing tool to navigate to the container directly, you will see all the blobs that contain your logging data.

Note

The $logs container is not displayed when a container listing operation is performed, such as the List Containers operation. It must be accessed directly. For example, you can use the List Blobs operation to access the blobs in the $logs container.
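
As a minimal PowerShell sketch of this behavior (assuming the Az.Storage module and a storage context stored in a hypothetical $ctx variable): the container listing does not return $logs, but its blobs can still be listed by addressing the container by name.

# $logs does not appear in a normal container listing.
Get-AzStorageContainer -Context $ctx | Select-Object -ExpandProperty Name

# Its blobs can still be listed by addressing the container directly.
# The single quotes stop PowerShell from treating $logs as a variable.
Get-AzStorageBlob -Container '$logs' -Context $ctx -MaxCount 10 | Select-Object Name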

As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob. It can take up to an hour for log data to appear in the blobs in the $logs container because of the frequency at which the storage service flushes the log writers. Duplicate records may exist for logs created in the same hour. You can determine if a record is a duplicate by checking the RequestId and Operation number.

If you have a high volume of log data with multiple files for each hour, then you can use the blob metadata to determine what data the log contains by examining the blob metadata fields. This is also useful because there can sometimes be a delay while data is written to the log files: the blob metadata gives a more accurate indication of the blob content than the blob name.

Most storage-browsing tools enable you to view the metadata of blobs; you can also read this information using PowerShell or programmatically. The following PowerShell snippet is an example of filtering the list of log blobs by name to specify a time, and by metadata to identify just those logs that contain write operations.

# List log blobs for the Table service for the hour starting 2014-05-21 05:00 (UTC)
# and keep only those whose metadata indicates they contain write operations.
Get-AzStorageBlob -Container '$logs' |
Where-Object {
    $_.Name -match 'table/2014/05/21/05' -and
    $_.ICloudBlob.Metadata.LogType -match 'write'
} |
ForEach-Object {
    "{0}  {1}  {2}  {3}" -f $_.Name,
    $_.ICloudBlob.Metadata.StartTime,
    $_.ICloudBlob.Metadata.EndTime,
    $_.ICloudBlob.Metadata.LogType
}

For information about listing blobs programmatically, see Enumerating Blob Resources and Setting and Retrieving Properties and Metadata for Blob Resources.

Log naming conventions

Each log will be written in the following format:

<service-name>/YYYY/MM/DD/hhmm/<counter>.log

The following table describes each attribute in the log name:

Attribute Description
<service-name> The name of the storage service. For example: blob, table, or queue
YYYY The four digit year for the log. For example: 2011
MM The two digit month for the log. For example: 07
DD The two digit day for the log. For example: 31
hh The two digit hour that indicates the starting hour for the logs, in 24 hour UTC format. For example: 18
mm The two digit number that indicates the starting minute for the logs. Note: This value is unsupported in the current version of Storage Analytics, and its value will always be 00.
<counter> A zero-based counter with six digits that indicates the number of log blobs generated for the storage service in an hour time period. This counter starts at 000000. For example: 000001

The following is a complete sample log name that combines the above examples:

blob/2011/07/31/1800/000001.log

The following is a sample URI that can be used to access the above log:

https://<accountname>.blob.core.chinacloudapi.cn/$logs/blob/2011/07/31/1800/000001.log

When a storage request is logged, the resulting log name correlates to the hour when the requested operation completed. For example, if a GetBlob request was completed at 6:30 PM on 7/31/2011, the log would be written with the following prefix: blob/2011/07/31/1800/
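
Because log names embed the service and the UTC hour, you can list just the logs for a given hour by using that prefix. The following PowerShell sketch (assuming the Az.Storage module and a hypothetical $ctx storage context) lists the Blob service logs for the hour in the example above.

# List all Blob service logs for the hour starting at 18:00 UTC on 2011-07-31.
Get-AzStorageBlob -Container '$logs' -Prefix 'blob/2011/07/31/1800/' -Context $ctx |
    Select-Object Name, Length, LastModified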

Log metadata

All log blobs are stored with metadata that can be used to identify what logging data the blob contains. The following table describes each metadata attribute:

Attribute Description
LogType Describes whether the log contains information pertaining to read, write, or delete operations. This value can include one type or a combination of all three, separated by commas.

Example 1: write

Example 2: read,write

Example 3: read,write,delete
StartTime The earliest time of an entry in the log, in the form of YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:21:46Z
EndTime The latest time of an entry in the log, in the form of YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:22:09Z
LogVersion The version of the log format.

The following list displays complete sample metadata using the above examples:

  • LogType=write
  • StartTime=2011-07-31T18:21:46Z
  • EndTime=2011-07-31T18:22:09Z
  • LogVersion=1.0
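
If you want to inspect these attributes for a single log blob, you can read them through the same ICloudBlob.Metadata collection used in the earlier PowerShell snippet. A minimal sketch, again assuming the Az.Storage module and a hypothetical $ctx storage context:

# Fetch one log blob and print the metadata attributes described above.
$log = Get-AzStorageBlob -Container '$logs' -Blob 'blob/2011/07/31/1800/000001.log' -Context $ctx
$log.ICloudBlob.Metadata   # Contains LogType, StartTime, EndTime, and LogVersion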

Enable Storage logging

You can enable Storage logging with the Azure portal, PowerShell, and the Storage SDKs.

Enable Storage logging using the Azure portal

In the Azure portal, use the Diagnostics settings (classic) blade to control Storage Logging, accessible from the Monitoring (classic) section of a storage account's Menu blade.

You can specify the storage services that you want to log, and the retention period (in days) for the logged data.

Enable Storage logging using PowerShell

You can use PowerShell on your local machine to configure Storage Logging in your storage account by using the Azure PowerShell cmdlet Get-AzStorageServiceLoggingProperty to retrieve the current settings, and the cmdlet Set-AzStorageServiceLoggingProperty to change the current settings.

The cmdlets that control Storage Logging use a LoggingOperations parameter that is a string containing a comma-separated list of request types to log. The three possible request types are read, write, and delete. To switch off logging, use the value none for the LoggingOperations parameter.

The following command switches on logging for read, write, and delete requests in the Queue service in your default storage account with retention set to five days:

Set-AzStorageServiceLoggingProperty -ServiceType Queue -LoggingOperations read,write,delete -RetentionDays 5  

The following command switches off logging for the Table service in your default storage account:

Set-AzStorageServiceLoggingProperty -ServiceType Table -LoggingOperations none  
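
To confirm the current settings, you can read them back with the companion retrieval cmdlet mentioned above. A minimal sketch, assuming the same default storage account used in the previous commands:

Get-AzStorageServiceLoggingProperty -ServiceType Queue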

For information about how to configure the Azure PowerShell cmdlets to work with your Azure subscription and how to select the default storage account to use, see How to install and configure Azure PowerShell.

Enable Storage logging programmatically

In addition to using the Azure portal or the Azure PowerShell cmdlets to control Storage Logging, you can also use one of the Azure Storage APIs. For example, if you are using a .NET language, you can use the Storage Client Library.

// Requires the Azure.Storage.Queues package.
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

// Placeholder connection string for the storage account.
string connectionString = "<connection-string>";

QueueServiceClient queueServiceClient = new QueueServiceClient(connectionString);

// Retrieve the current service properties so that existing settings are kept.
QueueServiceProperties serviceProperties = queueServiceClient.GetProperties().Value;

// Turn on logging for delete operations.
serviceProperties.Logging.Delete = true;

// Retain the log data for two days.
QueueRetentionPolicy retentionPolicy = new QueueRetentionPolicy();
retentionPolicy.Enabled = true;
retentionPolicy.Days = 2;
serviceProperties.Logging.RetentionPolicy = retentionPolicy;

// Leaving these null omits them from the request, so the existing
// metrics and CORS settings are not changed.
serviceProperties.HourMetrics = null;
serviceProperties.MinuteMetrics = null;
serviceProperties.Cors = null;

queueServiceClient.SetProperties(serviceProperties);

For more information about using a .NET language to configure Storage Logging, see the Storage Client Library Reference.

For general information about configuring Storage Logging using the REST API, see Enabling and Configuring Storage Analytics.

Download Storage logging log data

To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. Many storage-browsing tools enable you to download blobs from your storage account; you can also use AzCopy, the command-line copy tool provided by the Azure Storage team, to download your log data.

Note

The $logs container isn't integrated with Event Grid, so you won't receive notifications when log files are written.

To make sure you download the log data you are interested in and to avoid downloading the same log data more than once:

  • Use the date and time naming convention for blobs containing log data to track which blobs you have already downloaded for analysis, so that you avoid re-downloading the same data more than once.

  • Use the metadata on the blobs containing log data to identify the specific period for which the blob holds log data, so that you can identify the exact blob you need to download.

To get started with AzCopy, see Get started with AzCopy.

The following example shows how you can download the log data for the Queue service for the hours starting at 09 AM, 10 AM, and 11 AM on 20th May, 2014.

azcopy copy 'https://mystorageaccount.blob.core.chinacloudapi.cn/$logs/queue' 'C:\Logs\Storage' --include-path '2014/05/20/09;2014/05/20/10;2014/05/20/11' --recursive

To learn more about how to download specific files, see Download specific files.

When you have downloaded your log data, you can view the log entries in the files. These log files use a delimited text format that many log reading tools are able to parse (for more information, see the guide Monitoring, Diagnosing, and Troubleshooting Azure Storage). Different tools have different facilities for formatting, filtering, sorting, and searching the contents of your log files. For more information about the Storage Logging log file format and content, see Storage Analytics Log Format and Storage Analytics Logged Operations and Status Messages.
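
As a minimal PowerShell sketch of working with a downloaded file: the example below splits each semicolon-delimited entry into fields and counts entries per operation type. The file path is hypothetical, and the field positions used (operation type in the third field, request ID in the fourteenth) are assumptions that you should verify against the version 1.0 layout documented in Storage Analytics Log Format.

# Hypothetical path to a downloaded log file; adjust to your download location.
$logFile = 'C:\Logs\Storage\000001.log'

Get-Content $logFile | ForEach-Object {
    # Simple split; quoted fields that themselves contain semicolons are not handled.
    $fields = $_ -split ';'
    [PSCustomObject]@{
        OperationType = $fields[2]    # Assumed position of operation-type (verify)
        RequestStatus = $fields[3]    # Assumed position of request-status (verify)
        RequestId     = $fields[13]   # Assumed position of request-id-header (verify)
    }
} | Group-Object OperationType | Select-Object Name, Count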

Next steps