REST API 2.0

The Databricks REST API 2.0 supports services to manage your workspace, DBFS, clusters, instance pools, jobs, libraries, users and groups, tokens, and MLflow experiments and models.

This article provides an overview of how to use the REST API. Links to each API reference, authentication options, and examples are listed at the end of the article.

For information about authenticating to the REST API using personal access tokens, see Authentication using Azure Databricks personal access tokens. For API examples, see API examples.

For information about authenticating to the REST API using Azure Active Directory tokens, see Authentication using Azure Active Directory tokens. For examples, see Use an Azure AD access token for a user and Use an Azure AD access token for a service principal.

Rate limits

The Databricks REST API supports a maximum of 30 requests/second per workspace. Requests that exceed the rate limit receive a 429 response status code.
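
If you hit the limit, a simple strategy is to back off and retry. The following is a minimal sketch rather than an official client pattern; it assumes a personal access token stored in a DATABRICKS_TOKEN environment variable and uses the Clusters API list endpoint as an example:

# Retry a GET with increasing delays while the workspace returns 429 (rate limit exceeded).
for delay in 1 2 4 8; do
  status=$(curl -s -o response.json -w '%{http_code}' \
    -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    "https://<databricks-instance>/api/2.0/clusters/list")
  [ "$status" != "429" ] && break
  sleep "$delay"
done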

Parse output

It can be useful to parse out parts of the JSON output. In these cases, we recommend that you use the utility jq. For more information, see the jq Manual. You can install jq on macOS using Homebrew by running brew install jq.
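
For example, assuming a personal access token stored in a DATABRICKS_TOKEN environment variable, the following sketch pipes the Clusters API list response through jq to print only the cluster names (the .clusters[].cluster_name path follows the shape of that list response):

# List clusters and extract just their names with jq.
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  https://<databricks-instance>/api/2.0/clusters/list \
  | jq '.clusters[].cluster_name'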

Some STRING fields (which contain error/descriptive messaging intended to be consumed by the UI) are unstructured, and you should not depend on the format of these fields in programmatic workflows.

Invoke a GET using a query string

While most API calls require that you specify a JSON body, for GET calls you can specify a query string.

In the following examples, replace <databricks-instance> with the workspace URL of your Azure Databricks deployment.

To get the details for a cluster, run:

curl ... https://<databricks-instance>/api/2.0/clusters/get?cluster_id=<cluster-id>

To list the contents of the DBFS root, run:

curl ... https://<databricks-instance>/api/2.0/dbfs/list?path=/
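
Written out in full, the first call might look like the following sketch. It assumes authentication with a personal access token stored in a DATABRICKS_TOKEN environment variable (see the authentication links earlier in this article) and a real <cluster-id> from your workspace; the URL is quoted so the shell does not interpret the ? in the query string:

# Get cluster details, passing cluster_id as a query string parameter.
curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "https://<databricks-instance>/api/2.0/clusters/get?cluster_id=<cluster-id>"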

Runtime version strings

Many API calls require you to specify a Databricks runtime version string. This section describes the structure of a version string in the Databricks REST API.

Databricks Runtime

<M>.<F>.x[-cpu][-gpu][-ml][-hls][-conda]-scala<scala-version>

where

  • M - Databricks Runtime major release
  • F - Databricks Runtime feature release
  • cpu - CPU version (with -ml only)
  • gpu - GPU-enabled
  • ml - Machine learning
  • hls - Genomics
  • conda - with Conda (no longer available)
  • scala-version - version of Scala used to compile Spark: 2.10, 2.11, or 2.12

For example, 5.5.x-scala2.10 and 6.3.x-gpu-scala2.11. The Supported releases and End-of-support history tables map Databricks Runtime versions to the Spark version contained in the runtime.
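
As an illustration of where the version string is used, a Clusters API create request takes it in the spark_version field. The following is a minimal sketch, assuming a personal access token stored in a DATABRICKS_TOKEN environment variable and a node type that exists in your workspace (Standard_D3_v2 here is only an example):

# Create a small cluster pinned to a specific Databricks Runtime version string.
curl -s -X POST -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  https://<databricks-instance>/api/2.0/clusters/create \
  -d '{
    "cluster_name": "example-cluster",
    "spark_version": "5.5.x-scala2.10",
    "node_type_id": "Standard_D3_v2",
    "num_workers": 1
  }'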

Databricks Light

apache-spark-<M>.<F>.x-scala<scala-version>

where

  • M - Apache Spark major release
  • F - Apache Spark feature release
  • scala-version - version of Scala used to compile Spark: 2.10 or 2.11

For example, apache-spark-2.4.x-scala2.11.

APIs