Databricks CLI commands

Note

This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.

Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.

This article provides information about available Databricks CLI commands. This information supplements the command line help. For more information about installing and using the Databricks CLI, see Install or update the Databricks CLI and What is the Databricks CLI?.

The Databricks CLI includes the command groups listed in the following tables. Command groups contain sets of related commands, which can also contain subcommands. To output usage and syntax information for a command group, an individual command, or subcommand:

  • databricks <command-group> -h
  • databricks <command-group> <command-name> -h
  • databricks <command-group> <command-name> <subcommand-name> -h

Many CLI commands map to operations that are documented in the Azure Databricks REST API reference.
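For example, to view help for the clusters command group and then for one of its commands (these invocations only print help text and require the CLI to be installed):

```shell
# Show usage for the whole command group.
databricks clusters -h

# Show usage, flags, and arguments for a single command in that group.
databricks clusters get -h
```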

Workspace commands

Group Description and commands
fs Commands for managing files and the file system:

cat, cp, ls, mkdir, rm
git-credentials Commands for registering personal access tokens for Databricks to do operations on behalf of the user:

create, delete, get, list, update
repos Commands for allowing users to manage their git repos:

create, delete, get, list, update

get-permission-levels, get-permissions, set-permissions, update-permissions
secrets Commands for managing secrets, secret scopes, and access permissions:

create-scope, delete-acl, delete-scope, delete-secret, get-acl, get-secret, list-acls, list-scopes, list-secrets, put-acl, put-secret
workspace Commands to list, import, export, and delete notebooks and folders in the workspace:

delete, export, export-dir, get-status, import, import-dir, list, mkdirs

get-permission-levels, get-permissions, set-permissions, update-permissions
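As an illustrative sketch of the workspace commands (the path is a placeholder and assumes an authenticated CLI profile):

```shell
# List the contents of the workspace root.
databricks workspace list /

# Show metadata (object type, language) for a workspace object.
# The path below is a placeholder for an object in your workspace.
databricks workspace get-status /Users/someone@example.com/my-notebook
```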

Compute commands

Group Description and commands
cluster-policies Commands to control users' ability to configure clusters based on a set of rules:

create, delete, edit, get, list

get-permission-levels, get-permissions, set-permissions, update-permissions
clusters Commands that allow you to create, start, edit, list, terminate, and delete clusters:

change-owner, create, delete, edit, events, get, list, list-node-types, list-zones, permanent-delete, pin, resize, restart, spark-versions, start, unpin

get-permission-levels, get-permissions, set-permissions, update-permissions
global-init-scripts Commands that enable workspace administrators to configure global initialization scripts for their workspace:

create, delete, get, list, update
instance-pools Commands to create, edit, delete, and list instance pools using ready-to-use cloud instances, which reduce cluster start and auto-scaling times:

create, delete, edit, get, list

get-permission-levels, get-permissions, set-permissions, update-permissions
instance-profiles Commands to allow admins to add, list, and remove instance profiles that users can launch clusters with:

add, edit, list, remove
libraries Commands to install, uninstall, and get the status of libraries on a cluster:

all-cluster-statuses, cluster-status, install, uninstall
policy-families Commands to view available policy families:

get, list
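A minimal sketch of the clusters commands; the cluster ID is a placeholder and the examples assume an authenticated CLI profile:

```shell
# List clusters in the workspace.
databricks clusters list

# Show details for one cluster; the ID below is a placeholder.
databricks clusters get 1234-567890-abcde123

# List the Spark versions available for new clusters.
databricks clusters spark-versions
```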

Jobs commands

Group Description and commands
jobs Commands to manage jobs:

cancel-all-runs, cancel-run, create, delete, delete-run, export-run, get, get-run, get-run-output, list, list-runs, repair-run, reset, run-now, submit, update

get-permission-levels, get-permissions, set-permissions, update-permissions
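A typical jobs workflow might look like the following sketch; the job and run IDs are placeholders and assume an authenticated CLI profile:

```shell
# List jobs and their IDs.
databricks jobs list

# Trigger a run of a job; 123 is a placeholder job ID.
databricks jobs run-now 123

# Retrieve the output of a finished run; 456 is a placeholder run ID.
databricks jobs get-run-output 456
```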

Delta Live Tables commands

Group Description and commands
pipelines Commands to create, edit, delete, start, and view details about pipelines:

create, delete, get, get-update, list-pipeline-events, list-pipelines, list-updates, start-update, stop, update

get-permission-levels, get-permissions, set-permissions, update-permissions
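A brief sketch of the pipelines commands; the pipeline ID is a placeholder and assumes an authenticated CLI profile:

```shell
# List pipelines in the workspace.
databricks pipelines list-pipelines

# Start an update for a pipeline; the UUID below is a placeholder pipeline ID.
databricks pipelines start-update 01234567-89ab-cdef-0123-456789abcdef
```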

Machine Learning commands

Group Description and commands
experiments Commands for interacting with experiments, which are the primary unit of organization in MLflow; all MLflow runs belong to an experiment:

create-experiment, create-run, delete-experiment, delete-run, delete-runs, delete-tag, get-by-name, get-experiment, get-history, get-run, list-artifacts, list-experiments, log-batch, log-inputs, log-metric, log-model, log-param, restore-experiment, restore-run, restore-runs, search-experiments, search-runs, set-experiment-tag, set-tag, update-experiment, update-run

get-permission-levels, get-permissions, set-permissions, update-permissions
model-registry Commands for the Workspace Model Registry:

approve-transition-request, create-comment, create-model, create-model-version, create-transition-request, create-webhook, delete-comment, delete-model, delete-model-tag, delete-model-version, delete-model-version-tag, delete-transition-request, delete-webhook, get-latest-versions, get-model, get-model-version, get-model-version-download-uri, list-models, list-transition-requests, list-webhooks, reject-transition-request, rename-model, search-model-versions, search-models, set-model-tag, set-model-version-tag, test-registry-webhook, transition-stage, update-comment, update-model, update-model-version, update-webhook

get-permission-levels, get-permissions, set-permissions, update-permissions
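An illustrative sketch of the experiments commands; the experiment path is a placeholder and assumes an authenticated CLI profile:

```shell
# List experiments in the workspace.
databricks experiments list-experiments

# Look up an experiment by name; the path below is a placeholder.
databricks experiments get-by-name /Users/someone@example.com/my-experiment
```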

Real-time serving commands

Group Description and commands
serving-endpoints Commands to create, update, and delete model serving endpoints:

build-logs, create, delete, export-metrics, get, list, logs, patch, put, query, update-config

get-permission-levels, get-permissions, set-permissions, update-permissions
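A brief sketch of the serving-endpoints commands; the endpoint name is a placeholder and assumes an authenticated CLI profile:

```shell
# List model serving endpoints.
databricks serving-endpoints list

# Get the configuration and state of one endpoint; my-endpoint is a placeholder.
databricks serving-endpoints get my-endpoint
```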

Identity and access management commands

Group Description and commands
account Commands for managing Databricks accounts:

- Identity and access: access-control, groups, service-principals, users, workspace-assignment
- Unity Catalog: metastore-assignments, metastores, storage-credentials
- Settings: ip-access-lists, network-connectivity, settings
- Provisioning: credentials, encryption-keys, networks, private-access, storage, vpc-endpoints, workspaces
- Billing: billable-usage, log-delivery
- OAuth: custom-app-integration, o-auth-published-apps, published-app-integration, service-principal-secrets
auth Commands for authentication:

describe, env, login, profiles, token
current-user Commands to retrieve information about the currently authenticated user or service principal:

me
groups Commands for groups that simplify identity management, making it easier to assign access to Databricks workspace, data, and other securable objects:

create, delete, get, list, patch, update
permissions Commands to create, read, write, edit, update, and manage access for various users on different objects and endpoints:

get, set, update

get-permission-levels
service-principals Commands for identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms:

create, delete, get, list, patch, update
users Commands for user identities recognized by Databricks and represented by email addresses:

create, delete, get, list, patch, update

get-permission-levels, get-permissions, set-permissions, update-permissions
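As a sketch of a common authentication workflow (the workspace URL is a placeholder):

```shell
# Authenticate to a workspace with OAuth user-to-machine login.
# The URL below is a placeholder for your workspace URL.
databricks auth login --host https://adb-1234567890123456.7.azuredatabricks.net

# Confirm which identity the CLI is using.
databricks current-user me
```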
Databricks SQL commands

Group Description and commands
alerts Commands to perform operations on alerts:

create, delete, get, list, update
data-sources Commands for making new query objects:

list
queries Commands to perform operations on query definitions:

create, delete, get, list, restore, update
query-history Commands to access the history of queries through SQL warehouses:

list
warehouses Commands to manage SQL warehouses, which are a compute resource that lets you run SQL commands on data objects within Databricks SQL:

create, delete, edit, get, get-workspace-warehouse-config, list, set-workspace-warehouse-config, start, stop

get-permission-levels, get-permissions, set-permissions, update-permissions
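A minimal sketch of the warehouses commands; the warehouse ID is a placeholder and assumes an authenticated CLI profile:

```shell
# List SQL warehouses in the workspace.
databricks warehouses list

# Start a warehouse; the ID below is a placeholder.
databricks warehouses start 1234567890abcdef
```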

Unity Catalog commands

Group Description and commands
artifact-allowlists Commands to manage artifact allowlists. In Databricks Runtime 13.3 and above, you can add libraries and init scripts to the allowlist in Unity Catalog so that users can leverage these artifacts on compute configured with shared access mode:

get, update
catalogs Commands to manage catalogs, the first layer of Unity Catalog's three-level namespace:

create, delete, get, list, update
connections Commands to create a connection to an external data source:

create, delete, get, list, update
external-locations Commands to manage external locations, which combine a cloud storage path with a storage credential that authorizes access to the cloud storage path:

create, delete, get, list, update
functions Commands to manage User-Defined Functions (UDFs) in Unity Catalog:

create, delete, get, list, update
grants Commands to grant access to data in Unity Catalog:

get, get-effective, update
metastores Commands to manage metastores, which are the top-level container of objects in Unity Catalog:

assign, create, current, delete, get, list, summary, unassign, update, update-assignment
model-versions Commands to manage model versions. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.

delete, get, get-by-alias, list, update
online-tables Commands to manage online tables, which provide lower latency and higher QPS access to data from Delta tables:

create, delete, get
quality-monitors Commands to manage monitors, which compute and monitor data or model quality metrics for a table over time:

create, delete, get, get-refresh, list-refreshes, run-refresh, update
registered-models Commands to manage registered models. Databricks provides a hosted version of MLflow Model Registry in Unity Catalog.

create, delete, delete-alias, get, list, set-alias, update
schemas Commands to manage schemas, which are the second layer of Unity Catalog's three-level namespace:

create, delete, get, list, update
storage-credentials Commands to manage storage credentials, which are an authentication and authorization mechanism for accessing data stored on your cloud tenant:

create, delete, get, list, update, validate
system-schemas Commands to manage system schemas, which are schemas that live within the system catalog:

disable, enable, list
table-constraints Commands to manage primary key and foreign key constraints that encode relationships between fields in tables:

create, delete
tables Commands to manage tables, which reside in the third layer of Unity Catalog's three-level namespace:

delete, exists, get, list, list-summaries
volumes Commands to manage volumes, which are a Unity Catalog (UC) capability for accessing, storing, governing, organizing and processing files:

create, delete, list, read, update
workspace-bindings Commands to manage securable workspace bindings. Securables in Databricks can be configured as OPEN or ISOLATED.

get, get-bindings, update, update-bindings
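As a sketch of walking Unity Catalog's three-level namespace (the catalog and schema names are placeholders and assume an authenticated CLI profile):

```shell
# List catalogs, then the schemas in one catalog, then the tables in one schema.
databricks catalogs list
databricks schemas list my_catalog            # my_catalog is a placeholder
databricks tables list my_catalog my_schema   # both names are placeholders
```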

Delta sharing commands

Group Description and commands
providers Commands to manage data providers, which represent the organizations who share data:

create, delete, get, list, list-shares, update
recipient-activation Commands to manage recipient activation, which is only applicable in the open sharing model where the recipient object has the TOKEN authentication type:

get-activation-url-info, retrieve-token
recipients Commands to manage recipients, which you create to represent an organization that you want to allow access to shares:

create, delete, get, list, rotate-token, update

share-permissions
shares Commands to manage shares, which are containers for the data assets you share with recipients:

create, delete, get, list, update

share-permissions, update-permissions

Settings commands

Group Description and commands
ip-access-lists Commands to enable admins to configure IP access lists:

create, delete, get, list, replace, update
settings Commands to allow users to manage settings at the workspace level:

automatic-cluster-update, csp-enablement, default-namespace, esm-enablement, restrict-workspace-admins
token-management Commands that enable administrators to get all tokens and delete tokens for other users:

create-obo-token, delete, get, list

get-permission-levels, get-permissions, set-permissions, update-permissions
tokens Commands to create, list, and revoke tokens that can be used to authenticate and access Databricks REST APIs:

create, delete, list
workspace-conf Commands to update workspace settings:

get-status, set-status
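A brief sketch of the tokens commands; the comment text and lifetime value are illustrative and assume an authenticated CLI profile:

```shell
# List personal access tokens for the current user.
databricks tokens list

# Create a token; the comment is illustrative and the lifetime is in seconds.
databricks tokens create --comment "ci token" --lifetime-seconds 3600
```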

Developer tools commands

Group Description and commands
bundle Commands to manage Databricks Asset Bundles, which let you express your Databricks projects as code:

deploy, deployment, destroy, generate, init, open, run, schema, summary, sync, validate
sync Synchronize a local directory to a workspace directory.
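A typical bundle workflow might look like the following sketch; the resource key and workspace path are placeholders and assume an authenticated CLI profile:

```shell
# Validate the bundle configuration in the current directory,
# then deploy it and run a resource defined in it.
databricks bundle validate
databricks bundle deploy
databricks bundle run my_job   # my_job is a placeholder resource key

# Sync a local directory to a workspace path; the path below is a placeholder.
databricks sync . /Workspace/Users/someone@example.com/my-project
```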

Vector search commands

Group Description and commands
vector-search-endpoints Commands to manage vector search endpoints, which represent the compute resources to host vector search indexes:

create-endpoint, delete-endpoint, get-endpoint, list-endpoints
vector-search-indexes Commands to manage vector search indexes, an efficient representation of your embedding vectors that supports real-time and efficient approximate nearest neighbor (ANN) search queries:

create-index, delete-data-vector-index, delete-index, get-index, list-indexes, query-index, sync-index, upsert-data-vector-index
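An illustrative sketch of the vector search commands; the endpoint name and the --endpoint-name flag usage are assumptions based on the list shown above:

```shell
# List vector search endpoints.
databricks vector-search-endpoints list-endpoints

# List the indexes hosted on one endpoint; my-endpoint is a placeholder name.
databricks vector-search-indexes list-indexes --endpoint-name my-endpoint
```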

Dashboard commands

Group Description and commands
dashboards Commands for modifying dashboards:

create, delete, get, list, restore, update
lakeview Commands that provide specific management operations for AI/BI dashboards:

create, get, get-published, migrate, publish, trash, unpublish, update

Additional commands

Group Description and commands
api Commands to make requests to the Databricks REST API:

delete, get, head, patch, post, put
completion Commands to generate the autocompletion script for the specified shell:

bash, fish, powershell, zsh
configure Configure the Databricks CLI.
help Output usage information for any command.
labs Commands to manage Databricks Labs installations:

clear-cache, install, installed, list, show, uninstall, upgrade
version Retrieve the version of the CLI currently being used.
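As a sketch of the api command, which is useful when no dedicated command group covers an endpoint (assumes an authenticated CLI profile):

```shell
# Call a REST endpoint directly; the path maps to the
# Azure Databricks REST API reference.
databricks api get /api/2.0/clusters/list
```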

Global flags

The following flags are available to all Databricks CLI commands. Note that some flags do not apply to some commands. For detailed information about specific commands and their flags, see the command-line help.

Flag Description
-h or --help Display help for the Databricks CLI, the related command group, or the related command.
-e or --environment A string representing the bundle environment to use, if applicable for the related command.
--log-file A string representing the file to write output logs to. If this flag is not specified then the default is to write output logs to stderr.
--log-format text to write output logs as text or json to write output logs as JSON. If this flag is not specified then output logs are written as text.
--log-level A string representing the log level. If this flag is not specified then logging is disabled.
-o or --output text to write output as text or json to write output as JSON. If this flag is not specified then output is written as text.
-p or --profile A string representing the named configuration profile to use within your .databrickscfg file. If this flag is not specified then the DEFAULT named profile is used if one exists. You can press Tab after --profile or -p to display a list of existing available configuration profiles to choose from instead of entering the configuration profile name manually.
--progress-format The format for progress logs to display (default (the default) or append or inplace or json).
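As a sketch of combining global flags in one invocation (the profile name and log file path are placeholders):

```shell
# JSON output from a named profile, with debug logging written to a file.
databricks clusters list --output json --profile DEFAULT \
  --log-level debug --log-file ./cli.log
```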