Secrets

A secret is a key-value pair that stores secret material, with a key name unique within a secret scope. Each scope is limited to 1000 secrets. The maximum allowed secret value size is 128 KB.

Create a secret

Secret names are case insensitive.

The method for creating a secret depends on whether you are using an Azure Key Vault-backed scope or a Databricks-backed scope.

Create a secret in an Azure Key Vault-backed scope

To create a secret in Azure Key Vault, you use the Azure SetSecret REST API or the Azure portal UI.
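The Azure CLI offers an equivalent to the SetSecret REST API and the portal UI. The sketch below is one possible way to create the secret, assuming you already have a Key Vault and are signed in with az login; the vault name, secret name, and value are placeholders:

# Create or update a secret in an existing Azure Key Vault
az keyvault secret set \
  --vault-name <your-key-vault-name> \
  --name <secret-name> \
  --value <secret-value>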


Create a secret in a Databricks-backed scope

To create a secret in a Databricks-backed scope using the Databricks CLI (version 0.7.1 and above):

databricks secrets put --scope <scope-name> --key <key-name>

An editor opens and displays content like this:

# ----------------------------------------------------------------------
# Do not edit the above line. Everything that follows it will be ignored.
# Please input your secret value above the line. Text will be stored in
# UTF-8 (MB4) form and any trailing new line will be stripped.
# Exit without saving will abort writing secret.

Paste your secret value above the line, then save and exit the editor. The comments are stripped from your input, and the value is stored and associated with the key in the scope.

If you issue a write request with a key that already exists, the new value overwrites the existing value.

You can also provide a secret from a file or from the command line. For more information about writing secrets, see Secrets CLI.
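For example, the legacy CLI also accepts the value inline or from a file. The --string-value and --binary-file flags shown below are available in recent 0.x CLI releases; check databricks secrets put --help for your version:

# Provide the secret value directly on the command line
databricks secrets put --scope <scope-name> --key <key-name> --string-value <secret-value>

# Provide the secret value from a file
databricks secrets put --scope <scope-name> --key <key-name> --binary-file <path-to-file>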

List secrets

To list secrets in a given scope:

databricks secrets list --scope <scope-name>

The response displays metadata about the secret, such as the secret key name and the last-updated timestamp (in milliseconds since epoch). You use the Secrets utilities in a notebook or job to read a secret. For example:

databricks secrets list --scope jdbc
Key name    Last updated
----------  --------------
password    1531968449039
username    1531968408097

Read a secret

You create secrets using the REST API or CLI, but you must use the Secrets utilities in a notebook or job to read a secret.
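For example, in a Python notebook you can read the password secret from the jdbc scope shown earlier with the Secrets utilities (the scope and key names are just the ones used in the listing example above):

# Returns the secret value as a string; the value is redacted if displayed in notebook output
password = dbutils.secrets.get(scope="jdbc", key="password")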

Secret paths in Spark configuration properties and environment variables

Important

This feature is in Public Preview.

Note

Available in Databricks Runtime 6.1 and above.

You can store the path to a secret in a Spark configuration property or an environment variable. Retrieved secrets are redacted from notebook output and from Spark driver and executor logs.

Important

Secrets are not redacted from stdout and stderr. As a workaround, set the Spark configuration property spark.databricks.acl.needAdminPermissionToViewLogs to true so that only users with manage permission can view the stdout page.
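For example, the property is set as a single line in the cluster's Spark config:

spark.databricks.acl.needAdminPermissionToViewLogs true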

Requirements and limitations

  • Cluster owners must have Read permission on the secret scope.
  • Only cluster owners can add a path to a secret in a Spark configuration property or environment variable and edit the existing scope and name. Owners change a secret using the Put secret API (see the sketch after this list). You must restart your cluster to fetch the secret again.
  • Users with Can Manage permission on the cluster can delete secret properties and environment variables.
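A minimal sketch of changing a secret with the Put secret REST API, assuming a workspace URL and a personal access token in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables (placeholder names), and the jdbc scope from the earlier example:

curl -X POST "$DATABRICKS_HOST/api/2.0/secrets/put" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{"scope": "jdbc", "key": "password", "string_value": "<new-secret-value>"}'

After the value changes, restart the cluster so that the new value is fetched, as noted above.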

Path value

The syntax of the Spark property or environment variable path value must be {{secrets/<scope-name>/<secret-name>}}.

The value must start with {{secrets/ and end with }}. The variable portions of the property or environment variable are:

  • <secret-prop-name>: The name of the secret property in the Spark configuration.
  • <scope-name>: The name of the scope with which the secret is associated.
  • <secret-name>: The unique name of the secret in the scope.

Note

  • There should be no spaces inside the curly brackets. If there are spaces, they are treated as part of the scope or secret name.
  • If the value format is incorrect (for example, only one opening or closing brace), the value is treated as a literal Spark configuration property or environment variable value.

Store the path to a secret in a Spark configuration property

You specify a secret path in a Spark configuration property in the following format:

spark.<secret-prop-name> <path-value>

spark.<secret-prop-name> is a Spark configuration property name that maps to the secret path. You can add multiple secrets to the Spark configuration as long as the secret property names are unique.

Example

spark.password {{secrets/testScope/testKey1}}

To fetch the secret in the notebook and use it, run spark.conf.get("spark.<secret-prop-name>"):

spark.conf.get("spark.password")

Store the path to a secret in an environment variable

You specify a secret path in an environment variable and use it in a cluster-scoped init script. These environment variables are not accessible from a program running in Spark.

SPARKPASSWORD=<path-value>

To fetch the secret in an init script, access $SPARKPASSWORD:

if [[ $SPARKPASSWORD ]]; then
  # $SPARKPASSWORD holds the resolved secret value; `use` is a placeholder for the command that consumes it
  use $SPARKPASSWORD
fi

Delete a secret

To delete a secret from a scope backed by Azure Key Vault, use the Azure DeleteSecret REST API or the Azure portal UI.