Databricks CLI migration
This article describes how to migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above. Databricks CLI versions 0.205 and above are in Public Preview.
For brevity, this article refers to Databricks CLI versions 0.18 and below as the "legacy" CLI, and Databricks CLI versions 0.205 and above as the "new" CLI.
For more information about the legacy and new CLIs, see:
- Databricks CLI (legacy) for the legacy CLI.
- What is the Databricks CLI? for the new CLI.
Uninstall the legacy CLI
If you have the legacy CLI installed and you want to uninstall it, use `pip` (or `pip3`, depending on your version of Python) to run the `uninstall` command, as follows:

pip uninstall databricks-cli
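To confirm the package is gone afterward, a quick (hedged) check is to look for it in pip's package list:

```shell
# Prints the package entry if databricks-cli is still installed,
# or a short note if it is not.
pip list 2>/dev/null | grep databricks-cli || echo "legacy CLI not installed"
```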
Install the new CLI
To learn how to install the new CLI, see Install or update the Databricks CLI.
Verify your CLI installation
If you are not sure whether you are using the new CLI, follow the instructions in this section to verify and adjust as needed. Before you follow these instructions, make sure to exit any Python virtual environments, `conda` environments, or similar environments.
To check the version of your default installation of the CLI, run the following command:
databricks -v
If the version number is not what you expect, do one of the following:

- If you want to use only one version of the CLI: uninstall all previous versions of the CLI that you no longer want to use. You might need to update your operating system's `PATH` so that the path to the remaining version of the CLI that you want to use is listed.
- If you want to keep using multiple versions of the CLI: prepend the full path to the version of the CLI that you want to use to each and every call to the CLI.
- If you want to keep using multiple versions of the CLI, but you do not want to keep prepending the full path to the version of the CLI that you use most often: make sure that the full path to that version is listed first in your operating system's `PATH`. Note that you must still prepend the full path to versions of the CLI that are not listed first in your operating system's `PATH`.

To update your operating system's `PATH`, do the following:
macOS or Linux
1. List the paths where `databricks` is installed by running one of the following commands:

   which -a databricks # Or: where databricks

2. Get the path to the installation that you want to use without prepending the full path to each and every call to the CLI. If you are not sure which path this is, run the full path to each location, followed by `-v`, for example:

   /usr/local/bin/databricks -v

3. To put the path to the installation that you want to use first in your `PATH`, run the following command, replacing `/usr/local/bin` with the path that you want to use. Do not add `databricks` to the end of this path. For example:

   export PATH="/usr/local/bin:$PATH"

4. To verify that the `PATH` was set correctly for the current terminal session, run `databricks` followed by `-v` and check the version number:

   databricks -v

5. To have the `PATH` set this way every time you restart your terminal, add the command from step 3 to your shell initialization file. For example, for Z shell this file is typically located at `~/.zshrc`, and for Bash it is typically located at `~/.bashrc`. For other shells, see your shell provider's documentation.

   After you update your shell initialization file, you must restart your terminal to apply the updated `PATH` value.
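The macOS/Linux steps above boil down to a single `PATH` edit. Here is a minimal sketch, assuming the copy of the CLI you want lives in `/usr/local/bin` (substitute the directory you found with `which -a databricks`):

```shell
# Prepend the chosen directory so its databricks wins the PATH lookup.
CLI_DIR="/usr/local/bin"        # assumption: your preferred install dir
export PATH="$CLI_DIR:$PATH"

# The first PATH entry is now the chosen directory.
echo "${PATH%%:*}"
```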
Windows
1. Right-click the installation of `databricks` that you want to use without prepending the full path to each and every call to the CLI.
2. Click Open file location.
3. Note the path to `databricks`, for example `C:\Windows`.
4. On the Start menu, search for Environment variables.
5. Click Edit environment variables for your account.
6. Select the Path variable in the User variables for `<username>` section.
7. Click Edit.
8. Click New.
9. Enter the path that you want to add, without `databricks.exe` (such as `C:\Windows`).
10. Use the Move Up button to move the path that you just added to the beginning of the list.
11. Click OK.
12. To verify that the `PATH` was set correctly, open a new Command Prompt, run `databricks` followed by `-v`, and check the version number:

    databricks -v
Use additional authentication types
The legacy CLI and the new CLI both support Azure Databricks personal access token authentication. However, Databricks recommends that you use other Azure Databricks authentication types, which only the new CLI supports, whenever possible.
If you must use Azure Databricks personal access token authentication, Databricks recommends that you use one that is associated with a service principal instead of an Azure Databricks account or workspace user. See Manage service principals.
The new CLI supports Microsoft Entra ID tokens in addition to Azure Databricks personal access tokens. These additional tokens are more secure as they typically expire in one hour, whereas Azure Databricks personal access tokens can be valid from one day up to indefinitely. This is especially important if a token is accidentally checked into version control systems that can be accessed by others. Also, the new CLI can automatically refresh these additional tokens when they expire, whereas refreshing Azure Databricks personal access tokens is either a manual process or can be difficult to automate.
For more information, see Authentication for the Databricks CLI.
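Authentication settings typically live in a `.databrickscfg` profile. The following is a hedged sketch of two profiles, one using a personal access token and one using a Microsoft Entra ID service principal; the host URL and IDs are placeholders, and the field names beyond `host` and `token` are assumptions based on Databricks unified client authentication, so verify them against the authentication documentation for your CLI version:

```ini
; ~/.databrickscfg (sketch; all values are placeholders)
[DEFAULT]
host  = https://adb-1234567890123456.7.azuredatabricks.net
token = <azure-databricks-personal-access-token>

; Hypothetical service principal profile (field names assumed from
; Databricks unified client authentication).
[azure-sp]
host                = https://adb-1234567890123456.7.azuredatabricks.net
azure_tenant_id     = <entra-id-tenant-id>
azure_client_id     = <entra-id-application-id>
azure_client_secret = <entra-id-client-secret>
```

You can then select a profile per command with `--profile`, for example `databricks clusters list --profile azure-sp`.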
Command group and command comparisons
The following table lists the legacy CLI command groups and their new CLI command group equivalents. Where significant differences exist between CLIs, additional tables list legacy CLI commands or options and their new CLI command or option equivalents.
Command groups
Legacy command group | New command group |
---|---|
`cluster-policies` | `cluster-policies`. All command names are the same. |
`clusters` | `clusters`. All command names are the same. |
`configure` | `configure`. See `configure` options. |
`fs` | `fs`. See `fs` commands. |
`groups` | `groups`. See `groups` commands. |
`instance-pools` | `instance-pools`. All command names are the same. |
`jobs` | `jobs`. All command names are the same. |
`libraries` | `libraries`. All command names are the same except for `list`. The `list` command is no longer available; use the `all-cluster-statuses` or `cluster-status` commands instead. |
`pipelines` | `pipelines`. See `pipelines` commands. |
`repos` | `repos`. All command names are the same. |
`runs` | `jobs`. See `runs` commands. |
`secrets` | `secrets`. See `secrets` commands. |
`stack` | Not available in the new CLI. Databricks recommends that you use the Databricks Terraform provider instead. |
`tokens` | `tokens`. See `tokens` commands. |
`unity-catalog` | Various. See `unity-catalog` command groups. |
`workspace` | `workspace`. See `workspace` commands. |
`configure` options
Legacy option | New option |
---|---|
`-o` | The legacy CLI uses `-o` for OAuth authentication. The new CLI repurposes `-o` to specify whether the CLI output is in text or JSON format. Not applicable to Azure Databricks. |
`--oauth` | Not applicable to Azure Databricks. |
`-s` or `--scope` | Not applicable to Azure Databricks. |
`-t` or `--token` | `-t` or `--token` (same) |
`-f` or `--token-file` | Not available in the new CLI. |
`--host` | `--host` (same) |
`--aad-token` | Use `--host` and specify a Microsoft Entra ID token when prompted instead of an Azure Databricks personal access token. |
`--insecure` | Not available in the new CLI. |
`--jobs-api-version` | Not available in the new CLI. The new CLI uses the Jobs API 2.1 only. To call the legacy Jobs API 2.0, use the legacy CLI and see Jobs CLI (legacy). |
`--debug` | For debugging and logging in the new CLI, see Debug mode. |
`--profile` | `--profile` (same) or `-p` |
`-h` or `--help` | `-h` or `--help` (same) |
`fs` commands

All `fs` commands in the legacy CLI are the same in the new CLI, except for `fs mkdirs`, which is renamed to `fs mkdir`, and `fs mv`, which is not available in the new CLI.
Legacy command | New command |
---|---|
`fs cat` | `fs cat` (same) |
`fs cp` | `fs cp` (same) |
`fs ls` | `fs ls` (same) |
`fs mkdirs` | `fs mkdir` |
`fs mv` | Not available in the new CLI. |
`fs rm` | `fs rm` (same) |
`groups` commands

Legacy command | New command |
---|---|
`groups add-member` | `groups patch` |
`groups create` | `groups create` (same) |
`groups delete` | `groups delete` (same) |
`groups list` | `groups list` (same) |
`groups list-members` | `groups list` |
`groups list-parents` | `groups list` |
`groups remove-member` | `groups patch` |
`pipelines` commands

Legacy command | New command |
---|---|
`pipelines create` | `pipelines create` (same) |
`pipelines delete` | `pipelines delete` (same) |
`pipelines deploy` | `pipelines create` |
`pipelines edit` | `pipelines update` |
`pipelines get` | `pipelines get` (same) |
`pipelines list` | `pipelines list-pipeline-events`, `pipelines list-pipelines`, or `pipelines list-updates` |
`pipelines reset` | `pipelines reset` (same) |
`pipelines start` | `pipelines start-update` |
`pipelines stop` | `pipelines stop` (same) |
`pipelines update` | `pipelines update` (same) |
`runs` commands

Legacy command | New command |
---|---|
`runs cancel` | `jobs cancel-run` |
`runs get` | `jobs get-run` |
`runs get-output` | `jobs get-run-output` |
`runs list` | `jobs list-runs` |
`runs submit` | `jobs submit` |
`secrets` commands

Legacy command | New command |
---|---|
`secrets create-scope` | `secrets create-scope` (same) |
`secrets delete` | `secrets delete-secret` |
`secrets delete-acl` | `secrets delete-acl` (same) |
`secrets delete-scope` | `secrets delete-scope` (same) |
`secrets get-acl` | `secrets get-acl` (same) |
`secrets list` | `secrets list-secrets` |
`secrets list-acls` | `secrets list-acls` (same) |
`secrets list-scopes` | `secrets list-scopes` (same) |
`secrets put` | `secrets put-secret` |
`secrets put-acl` | `secrets put-acl` (same) |
`secrets write` | `secrets put-secret` |
`secrets write-acl` | `secrets put-acl` |
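To make the renames concrete, here is a hedged side-by-side of the same secret write. The scope and key names are hypothetical, and the `--string-value` flag is assumed from the legacy CLI, so check `databricks secrets put-secret --help` on your installation:

```shell
# Same operation, spelled for each CLI (the commands are held in
# strings here only so the rename is easy to compare).
legacy="databricks secrets put --scope my-scope --key my-key --string-value s3cr3t"
new="databricks secrets put-secret my-scope my-key --string-value s3cr3t"
printf '%s\n%s\n' "$legacy" "$new"
```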
`tokens` commands

Legacy command | New command |
---|---|
`tokens create` | `tokens create` (same) |
`tokens list` | `tokens list` (same) |
`tokens revoke` | `tokens delete` |
`unity-catalog` command groups

`unity-catalog <command>` in the legacy CLI becomes just `<command>` in the new CLI.

Legacy command group | New command group |
---|---|
`unity-catalog catalogs` | `catalogs` (same, but drop `unity-catalog`) |
`unity-catalog external-locations` | `external-locations` (same, but drop `unity-catalog`) |
`unity-catalog lineage` | Not available in the new CLI. See Retrieve lineage using the Data Lineage REST API. |
`unity-catalog metastores` | `metastores` (same, but drop `unity-catalog`) |
`unity-catalog permissions` | `grants` |
`unity-catalog providers` | `providers` (same, but drop `unity-catalog`) |
`unity-catalog recipients` | `recipients` (same, but drop `unity-catalog`) |
`unity-catalog schemas` | `schemas` (same, but drop `unity-catalog`) |
`unity-catalog shares` | `shares` (same, but drop `unity-catalog`) |
`unity-catalog storage-credentials` | `storage-credentials` (same, but drop `unity-catalog`) |
`unity-catalog tables` | `tables` (same, but drop `unity-catalog`) |
`workspace` commands

Legacy command | New command |
---|---|
`workspace delete` | `workspace delete` (same) |
`workspace export` | `workspace export` (same) |
`workspace export-dir` | `workspace export` |
`workspace import` | `workspace import` (same) |
`workspace import-dir` | `workspace import` |
`workspace list` | `workspace list` (same) |
`workspace ls` | `workspace list` |
`workspace mkdirs` | `workspace mkdirs` (same) |
`workspace rm` | `workspace delete` |
Default and positional arguments
Most of the new CLI commands have at least one default argument that does not have an accompanying option. Some new CLI commands have two or more positional arguments that must be specified in a particular order and that do not have accompanying options. This is different from the legacy CLI, where most commands require options to be specified for all arguments. For example, the new CLI's `clusters get` command takes a cluster ID as a default argument, whereas the legacy CLI's `clusters get` command requires you to specify a `--cluster-id` option along with the cluster ID. For example:
For the legacy CLI:
# This works with the legacy CLI.
databricks clusters get --cluster-id 1234-567890-a1b23c4d
# This does **not** work with the legacy CLI - "Error:
# Missing None. One of ['cluster-id', 'cluster-name'] must be provided."
databricks clusters get 1234-567890-a1b23c4d
For the new CLI:
# This works with the new CLI.
databricks clusters get 1234-567890-a1b23c4d
# This does **not** work with the new CLI - "Error: unknown flag: --cluster-id"
databricks clusters get --cluster-id 1234-567890-a1b23c4d
As another example, the new CLI's `grants get` command takes two default arguments: the securable type followed by the securable's full name. However, the legacy CLI's `unity-catalog permissions get` command requires you to specify a `--<securable-type>` option along with the securable's full name. For example:
For the legacy CLI:
databricks unity-catalog permissions get --schema main.default
For the new CLI:
# This works with the new CLI.
databricks grants get schema main.default
# This does **not** work with the new CLI - "Error: unknown flag: --schema"
databricks grants get --schema main.default
Debug mode
The legacy CLI provides a `--debug` option to show the full stack trace on error. The new CLI does not recognize the `--debug` option. Instead, use the following options:

- Use `--log-file <path>` to write log information to the file specified in `<path>`. If this option is not provided, log information is output to stderr. Specifying `--log-file` without also specifying `--log-level` results in no log information being written to the file.
- Use `--log-format <type>` to specify the format of the logged information. `<type>` can be `text` (the default, if not specified) or `json`.
- Use `--log-level <level>` to specify the level of information logged. Allowed values are `disabled` (the default, if not specified), `trace`, `debug`, `info`, `warn`, and `error`.
For the legacy CLI, the following example shows the full stack trace on error:
databricks fs ls / --debug
# Output:
#
# HTTP debugging enabled
# NoneType: None
# Error: The path / must start with "dbfs:/"
For the new CLI, the following example logs the full stack trace to a file named `new-cli-errors.log` in the current working directory. The stack trace is written to the file in JSON format:
databricks fs ls / --log-file new-cli-errors.log --log-format json --log-level trace
# Output:
#
# Error: expected dbfs path (with the dbfs:/ prefix): /
#
# (The full stack trace is also written to the new-cli-errors.log file.)
Common questions
This section lists common questions about migrating from the legacy to the new CLI.
What is happening to the legacy CLI?
The legacy CLI is still available but is not receiving any non-critical updates. The legacy CLI documentation reflects this. Databricks recommends that users migrate to the new CLI as soon as possible.
The legacy CLI has always been in an Experimental state with a disclaimer that Databricks plans no new feature work for the legacy CLI, and the legacy CLI is not supported through Databricks support channels.
When will the legacy CLI be deprecated?
Databricks has not established a date or timeline for deprecating the legacy CLI. However, Databricks recommends that users migrate to the new CLI as soon as possible.
When will the new CLI be released as generally available (GA)?
A release date or timeline for releasing the new CLI as GA has not been established. This will depend on feedback that Databricks receives from users during the Public Preview.
What are the key differences between the legacy and new CLIs?
- The legacy CLI was released as a Python package. The new CLI is released as a standalone executable and does not need any runtime dependencies installed.
- The new CLI has complete coverage of the Databricks REST APIs. The legacy CLI does not.
- The new CLI is available as a Public Preview. The legacy CLI remains in an Experimental state.
Does the new CLI have full feature parity with the legacy CLI?
The new CLI has coverage for almost all of the commands from the legacy CLI. However, the legacy CLI's `stack` command group is notably absent from the new CLI. Also, a few legacy CLI command groups, such as `unity-catalog` and `runs`, have been refactored into different command groups in the new CLI. For migration guidance, see the information provided earlier in this article.
How do I migrate from the legacy to the new CLI?
For migration guidance, see the information provided earlier in this article. Note that the new CLI is not a drop-in replacement for the legacy CLI and requires some setup to move from the legacy to the new CLI.
Can installations of the legacy and new CLIs exist on the same machine?
Yes. Installations of the legacy and new CLIs can exist on the same machine, but they must be located in different directories. Because both executables are named `databricks`, you must control which one runs by default by configuring your machine's `PATH`. If you intend to run the new CLI but accidentally run the legacy CLI instead, by default the legacy CLI runs the new CLI with the same arguments and shows the following warning message:
Databricks CLI <new-version-number> found at <new-path>
Your current PATH prefers running CLI <old-version-number> at <old-path>
Because both are installed and available in PATH,
I assume you are trying to run the newer version.
If you want to disable this behavior you can set DATABRICKS_CLI_DO_NOT_EXECUTE_NEWER_VERSION=1.
Executing CLI <new-version-number>...
-------------------------------------
Databricks CLI <new-version-number>
As shown in the preceding warning message, you can set the `DATABRICKS_CLI_DO_NOT_EXECUTE_NEWER_VERSION` environment variable to `1` to disable this behavior and run the legacy CLI instead.
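For example, the opt-out can be set for the current shell session (or persisted in your shell initialization file) like this:

```shell
# Tell the legacy CLI not to delegate to a newer installation.
export DATABRICKS_CLI_DO_NOT_EXECUTE_NEWER_VERSION=1
echo "$DATABRICKS_CLI_DO_NOT_EXECUTE_NEWER_VERSION"
```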
Get help
To get help with migrating from the legacy CLI to the new CLI, see the following resources: