Quickstart: Manage blobs with JavaScript v10 SDK in Node.js

In this quickstart, you learn to manage blobs by using Node.js. Blobs are objects that can hold large amounts of text or binary data, including images, documents, streaming media, and archive data. You'll upload, download, list, and delete blobs, and you'll manage containers.


Download the sample application

The sample application in this quickstart is a simple Node.js console application. To begin, clone the repository to your machine using the following command:

git clone https://github.com/Azure-Samples/azure-storage-js-v10-quickstart.git

Next, change into the application's folder:

cd azure-storage-js-v10-quickstart

Now, open the folder in your favorite code editing environment.

Configure your storage credentials

Before running the application, you must provide the security credentials for your storage account. The sample repository includes a file named .env.example. Rename this file by removing the .example extension, which results in a file named .env. Inside the .env file, add your account name and access key values after the AZURE_STORAGE_ACCOUNT_NAME and AZURE_STORAGE_ACCOUNT_ACCESS_KEY keys.
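For reference, a completed .env file has this shape; the values shown here are placeholders, so substitute your own storage account name and access key:

```
AZURE_STORAGE_ACCOUNT_NAME=<your-storage-account-name>
AZURE_STORAGE_ACCOUNT_ACCESS_KEY=<your-storage-account-access-key>
```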

Install required packages

In the application directory, run npm install to install the required packages for the application.

npm install

Run the sample

Now that the dependencies are installed, you can run the sample by issuing the following command:

npm start

The output from the app will be similar to the following example:

Container: "demo" is created
Containers:
 - container-one
 - container-two
 - demo
Blob "quickstart.txt" is uploaded
Local file "./readme.md" is uploaded
Local file "./readme.md" is uploaded as a stream
Blobs in "demo" container:
 - quickstart.txt
 - readme-stream.md
 - readme.md
Downloaded blob content: "hello!"
Block blob "quickstart.txt" is deleted
Container "demo" is deleted
Done

If you're using a new storage account for this quickstart, then you may only see the demo container listed under the label "Containers:".

Understanding the code

The sample begins by importing a number of classes and functions from the Azure Blob storage namespace. Each of the imported items is discussed in context as it's used in the sample.

const {
    Aborter,
    BlockBlobURL,
    ContainerURL,
    ServiceURL,
    SharedKeyCredential,
    StorageURL,
    uploadFileToBlockBlob,
    uploadStreamToBlockBlob
} = require('@azure/storage-blob');

Credentials are read from environment variables based on the appropriate context.

if (process.env.NODE_ENV !== 'production') {
    require('dotenv').config();
}

The dotenv module loads environment variables when running the app locally for debugging. Values are defined in a file named .env and loaded into the current execution context. In production, the server configuration provides these values, which is why this code only runs when the script is not running under a "production" environment.

The next block of modules is imported to help interface with the file system.

const fs = require('fs');
const path = require('path');

The purpose of these modules is as follows:

  • fs is the native Node.js module used to work with the file system

  • path is required to determine the absolute path of the file, which is used when uploading a file to Blob storage

Next, environment variable values are read and set aside in constants.

const STORAGE_ACCOUNT_NAME = process.env.AZURE_STORAGE_ACCOUNT_NAME;
const ACCOUNT_ACCESS_KEY = process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY;

The next set of constants helps to reveal the intent of file size calculations during upload operations.

const ONE_MEGABYTE = 1024 * 1024;
const FOUR_MEGABYTES = 4 * ONE_MEGABYTE;

Requests made by the API can be set to time out after a given interval. The Aborter class is responsible for managing how requests are timed out, and the following constant is used to define the timeouts used in this sample.

const ONE_MINUTE = 60 * 1000;

Calling code

To support JavaScript's async/await syntax, all the calling code is wrapped in a function named execute. Then execute is called and handled as a promise.

async function execute() {
    // commands...
}

execute().then(() => console.log("Done")).catch((e) => console.log(e));

All of the following code runs inside the execute function where the // commands... comment is placed.
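The wrapper also funnels every failure to a single place. The following SDK-free sketch (the thrown error message is invented for illustration) shows that an error raised anywhere inside execute lands in the catch handler instead of crashing the process:

```javascript
// SDK-free sketch of the execute/catch pattern used by the sample.
async function execute() {
    // Stand-in for the storage commands; imagine one of them failing.
    throw new Error("container already exists");
}

execute()
    .then(() => console.log("Done"))
    .catch((e) => console.log(e.message)); // logs "container already exists"
```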

First, the relevant variables are declared to assign names, sample content, and the path to the local file to upload to Blob storage.

const containerName = "demo";
const blobName = "quickstart.txt";
const content = "hello!";
const localFilePath = "./readme.md";

Account credentials are used to create a pipeline, which is responsible for managing how requests are sent to the REST API. Pipelines are thread-safe and specify logic for retry policies, logging, HTTP response deserialization rules, and more.

const credentials = new SharedKeyCredential(STORAGE_ACCOUNT_NAME, ACCOUNT_ACCESS_KEY);
const pipeline = StorageURL.newPipeline(credentials);
const serviceURL = new ServiceURL(`https://${STORAGE_ACCOUNT_NAME}.blob.core.chinacloudapi.cn`, pipeline);

The following classes are used in this block of code:

  • The SharedKeyCredential class is responsible for wrapping storage account credentials to provide them to a request pipeline.

  • The StorageURL class is responsible for creating a new pipeline.

  • The ServiceURL models a URL used in the REST API. Instances of this class allow you to perform actions like listing containers and provide context information to generate container URLs.

The instance of ServiceURL is used with the ContainerURL and BlockBlobURL instances to manage containers and blobs in your storage account.

const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, blobName);

The containerURL and blockBlobURL variables are reused throughout the sample to act on the storage account.

At this point, the container doesn't exist in the storage account. The instance of ContainerURL represents a URL that you can act upon. By using this instance, you can create and delete the container. The location of this container equates to a location such as this:

https://<ACCOUNT_NAME>.blob.core.chinacloudapi.cn/demo
The blockBlobURL is used to manage individual blobs, allowing you to upload, download, and delete blob content. The URL represented here is similar to this location:

https://<ACCOUNT_NAME>.blob.core.chinacloudapi.cn/demo/quickstart.txt
As with the container, the block blob doesn't exist yet. The blockBlobURL variable is used later to create the blob by uploading content.

Using the Aborter class

Requests made by the API can be set to time out after a given interval. The Aborter class is responsible for managing how requests are timed out. The following code creates a context where a set of requests is given 30 minutes to execute.

const aborter = Aborter.timeout(30 * ONE_MINUTE);

Aborters give you control over requests by allowing you to:

  • designate the amount of time given for a batch of requests
  • designate how long an individual request in the batch has to execute
  • cancel requests
  • use the Aborter.none static member to stop your requests from timing out altogether
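Conceptually, a timeout is a race between the real operation and a timer. The sketch below is not the SDK's Aborter; withTimeout is a hypothetical helper that illustrates the underlying pattern with plain promises:

```javascript
// Illustration only: race an operation against a timer, as a timeout-based
// aborter does conceptually. Not part of @azure/storage-blob.
function withTimeout(promise, ms) {
    let timer;
    const timeout = new Promise((_, reject) => {
        timer = setTimeout(() => reject(new Error("operation timed out")), ms);
    });
    // Whichever settles first wins; the timer is always cleaned up.
    return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// An operation that finishes well before the 1000 ms deadline.
const fastOperation = new Promise(resolve => setTimeout(() => resolve("ok"), 10));

withTimeout(fastOperation, 1000).then(result => console.log(result)); // ok
```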

Create a container

To create a container, the ContainerURL's create method is used.

await containerURL.create(aborter);
console.log(`Container: "${containerName}" is created`);

As the name of the container is defined when calling ContainerURL.fromServiceURL(serviceURL, containerName), calling the create method is all that's required to create the container.

Show container names

Accounts can store a vast number of containers. The following code demonstrates how to list containers in a segmented fashion, which allows you to cycle through a large number of containers. The showContainerNames function is passed instances of Aborter and ServiceURL.

await showContainerNames(aborter, serviceURL);

The showContainerNames function uses the listContainersSegment method to request batches of container names from the storage account.

async function showContainerNames(aborter, serviceURL) {
    let marker = undefined;

    do {
        const listContainersResponse = await serviceURL.listContainersSegment(aborter, marker);
        marker = listContainersResponse.nextMarker;
        for (let container of listContainersResponse.containerItems) {
            console.log(` - ${ container.name }`);
        }
    } while (marker);
}

As each response is returned, its containerItems are iterated to log the names to the console. The loop repeats until nextMarker comes back empty, which signals that every segment has been fetched.
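The marker-driven loop can be tried without a storage account. In this self-contained sketch, fakeServiceURL is an invented in-memory stand-in for ServiceURL (not part of the SDK) that returns one container per segment, showing how nextMarker drives the do...while loop:

```javascript
// Hypothetical in-memory stand-in for ServiceURL.listContainersSegment.
const allContainers = ['container-one', 'container-two', 'demo'];

const fakeServiceURL = {
    // Returns one container per "segment", plus a nextMarker until exhausted.
    async listContainersSegment(aborter, marker) {
        const start = marker ? Number(marker) : 0;
        const containerItems = allContainers.slice(start, start + 1).map(name => ({ name }));
        const next = start + 1;
        return {
            containerItems,
            nextMarker: next < allContainers.length ? String(next) : undefined
        };
    }
};

// Same loop shape as the sample's showContainerNames.
async function showContainerNames(aborter, serviceURL) {
    const seen = [];
    let marker = undefined;
    do {
        const res = await serviceURL.listContainersSegment(aborter, marker);
        marker = res.nextMarker;
        for (const container of res.containerItems) {
            seen.push(container.name);
            console.log(` - ${container.name}`);
        }
    } while (marker);
    return seen;
}

showContainerNames(undefined, fakeServiceURL)
    .then(names => console.log(`${names.length} containers listed`)); // 3 containers listed
```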

Upload text

To upload text to the blob, use the upload method.

await blockBlobURL.upload(aborter, content, content.length);
console.log(`Blob "${blobName}" is uploaded`);

Here the text and its length are passed into the method.

Upload a local file

To upload a local file to the container, you need a container URL and the path to the file.

await uploadLocalFile(aborter, containerURL, localFilePath);
console.log(`Local file "${localFilePath}" is uploaded`);

The uploadLocalFile function calls the uploadFileToBlockBlob function, which takes the file path and an instance of the destination block blob as arguments.

async function uploadLocalFile(aborter, containerURL, filePath) {
    filePath = path.resolve(filePath);

    const fileName = path.basename(filePath);
    const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, fileName);

    return await uploadFileToBlockBlob(aborter, filePath, blockBlobURL);
}

Upload a stream

Uploading streams is also supported. This sample opens a local file as a stream to pass to the upload method.

await uploadStream(aborter, containerURL, localFilePath);
console.log(`Local file "${localFilePath}" is uploaded as a stream`);

The uploadStream function calls uploadStreamToBlockBlob to upload the stream to the storage container.

async function uploadStream(aborter, containerURL, filePath) {
    filePath = path.resolve(filePath);

    const fileName = path.basename(filePath).replace('.md', '-stream.md');
    const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, fileName);

    const stream = fs.createReadStream(filePath, {
        highWaterMark: FOUR_MEGABYTES,
    });

    const uploadOptions = {
        bufferSize: FOUR_MEGABYTES,
        maxBuffers: 5,
    };

    return await uploadStreamToBlockBlob(
        aborter,
        stream,
        blockBlobURL,
        uploadOptions.bufferSize,
        uploadOptions.maxBuffers);
}

During an upload, uploadStreamToBlockBlob allocates buffers to cache data from the stream in case a retry is necessary. The maxBuffers value designates at most how many buffers are used, as each buffer creates a separate upload request. Ideally, more buffers equate to higher speeds, but at the cost of higher memory usage. The upload speed plateaus when the number of buffers is high enough that the bottleneck transitions to the network or disk instead of the client.
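As a rough sketch of the arithmetic involved (assuming one upload request per full-or-partial buffer of data, which the hypothetical helper below encodes), a 10 MB file sent with 4 MB buffers needs three requests:

```javascript
const ONE_MEGABYTE = 1024 * 1024;
const FOUR_MEGABYTES = 4 * ONE_MEGABYTE;

// Hypothetical helper: number of block-upload requests needed to send
// fileSize bytes when the stream is consumed in bufferSize chunks.
function uploadRequestCount(fileSize, bufferSize) {
    return Math.ceil(fileSize / bufferSize);
}

console.log(uploadRequestCount(10 * ONE_MEGABYTE, FOUR_MEGABYTES)); // 3
```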

Show blob names

Just as accounts can contain many containers, each container can potentially contain a vast number of blobs. Access to each blob in a container is available via an instance of the ContainerURL class.

console.log(`Blobs in "${containerName}" container:`);
await showBlobNames(aborter, containerURL);

The function showBlobNames calls listBlobFlatSegment to request batches of blobs from the container.

async function showBlobNames(aborter, containerURL) {
    let marker = undefined;

    do {
        const listBlobsResponse = await containerURL.listBlobFlatSegment(aborter, marker);
        marker = listBlobsResponse.nextMarker;
        for (const blob of listBlobsResponse.segment.blobItems) {
            console.log(` - ${ blob.name }`);
        }
    } while (marker);
}

Download a blob

Once a blob is created, you can download the contents by using the download method.

const downloadResponse = await blockBlobURL.download(aborter, 0);
const downloadedContent = await streamToString(downloadResponse.readableStreamBody);
console.log(`Downloaded blob content: "${downloadedContent}"`);

The response is returned as a stream. In this example, the stream is converted to a string by using the following streamToString helper function.

// A helper method used to read a Node.js readable stream into a string
async function streamToString(readableStream) {
    return new Promise((resolve, reject) => {
        const chunks = [];
        readableStream.on("data", (data) => {
            chunks.push(data.toString());
        });
        readableStream.on("end", () => {
            resolve(chunks.join(''));
        });
        readableStream.on("error", reject);
    });
}

Delete a blob

The delete method from a BlockBlobURL instance deletes a blob from the container.

await blockBlobURL.delete(aborter);
console.log(`Block blob "${blobName}" is deleted`);

Delete a container

The delete method from a ContainerURL instance deletes a container from the storage account.

await containerURL.delete(aborter);
console.log(`Container "${containerName}" is deleted`);

Clean up resources

All data written to the storage account is automatically deleted at the end of the code sample.

Next steps

This quickstart demonstrates how to manage blobs and containers in Azure Blob storage using Node.js. To learn more about working with this SDK, refer to the GitHub repository.