June 2022
These features and Azure Databricks platform improvements were released in June 2022.
Note
The release dates and content listed below correspond, in most cases, to actual deployments in the Azure Public Cloud.
This page provides the release history of the Azure Databricks service on the Azure Public Cloud for your reference; it may not apply to Azure operated by 21Vianet.
Note
Releases are staged. Your Azure Databricks account may not be updated until a week or more after the initial release date.
ALTER TABLE permission changes for Unity Catalog
June 30, 2022
In Unity Catalog, the privileges required to run ALTER TABLE statements have been updated. Previously, OWNERSHIP of a table was required to run all ALTER TABLE statements. Now, OWNERSHIP on the table is required only for changing the owner, granting permissions on the table, changing the table name, and modifying a view definition. For all other metadata operations on a table (for example, updating comments, properties, or columns), you can make updates if you have the MODIFY permission on the table.
See ALTER TABLE.
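The privilege rule above can be sketched as a small decision function. This is an illustrative model only, not the Unity Catalog implementation (enforcement happens server-side); the operation names and the `can_alter` helper are hypothetical.

```python
# Hypothetical model of the updated ALTER TABLE privilege rule.
# Operations that still require OWNERSHIP on the table:
OWNERSHIP_REQUIRED = {
    "change_owner",
    "grant_permissions",
    "rename_table",
    "alter_view_definition",
}

def can_alter(operation: str, privileges: set) -> bool:
    """Return True if the caller's privileges allow the ALTER TABLE operation."""
    if operation in OWNERSHIP_REQUIRED:
        return "OWNERSHIP" in privileges
    # All other metadata operations (comments, properties, columns)
    # now require only MODIFY; OWNERSHIP still implies them.
    return "MODIFY" in privileges or "OWNERSHIP" in privileges

print(can_alter("update_comment", {"MODIFY"}))  # True under the new rule
print(can_alter("rename_table", {"MODIFY"}))    # False: still needs OWNERSHIP
```

Before this change, every branch of `can_alter` would have required OWNERSHIP.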
Databricks Runtime 6.4 Extended Support reaches end of support
June 30, 2022
Support for Databricks Runtime 6.4 Extended Support ended on June 30. See Databricks support lifecycles.
Databricks Runtime 10.2 series support ends
June 22, 2022
Support for Databricks Runtime 10.2 and Databricks Runtime 10.2 for Machine Learning ended on June 22. See Databricks support lifecycles.
Databricks ODBC driver 2.6.24
June 22, 2022
We have released version 2.6.24 of the Databricks ODBC driver (download). This release adds support for configuring query translation to CTAS syntax, allows users to override SQL_ATTR_QUERY_TIMEOUT in the connector, and updates the OpenSSL library.
This release also resolves the following issues:
- The connector does not allow the use of server and intermediate certificates that do not have a CRL Distribution Points (CDP) entry.
- When using a proxy, the connector sets the incorrect host name for SSL Server Name Indication (SNI).
Databricks Terraform provider is now GA
June 22, 2022
The Databricks Terraform provider is now generally available.
Terraform enables you to fully automate the deployment of your data platforms using existing infrastructure-as-code (IaC) processes.
You can use the Databricks Terraform provider to define assets in Azure Databricks workspaces, such as clusters and jobs, and to enforce access control through permissions for users, groups, and service principals.
The Databricks Terraform provider provides a complete audit trail of deployments. You can use the Databricks Terraform provider as a backbone for your disaster recovery and business continuity strategies.
The Databricks Terraform provider also supports Unity Catalog (Preview), allowing you to deploy this key governance feature with ease and at scale.
Databricks Runtime 11.0 and 11.0 ML are GA; 11.0 Photon is Public Preview
June 16, 2022
Databricks Runtime 11.0 and Databricks Runtime 11.0 ML are now generally available. Databricks Runtime 11.0 Photon is in Public Preview.
See Databricks Runtime 11.0 (EoS) and Databricks Runtime 11.0 for Machine Learning (EoS).
Change to Repos default working directory in Databricks Runtime 11.0
June 16, 2022
The Python working directory for notebooks in a Repo defaults to the directory containing the notebook. For example, instead of /databricks/driver, the default working directory is /Workspace/Repos/<user>/<repo>/<path-to-notebook>. This allows importing and reading from Files in Repos to work by default on Databricks Runtime 11.0 clusters.
This also means that writing to the current working directory fails with a Read-only filesystem error message. If you want to continue writing to the local file system of a cluster, write to /tmp/<filename> or /databricks/driver/<filename>.
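The recommended write pattern can be sketched as follows. This is a minimal, portable illustration: it uses `tempfile.gettempdir()` as a stand-in for the cluster's /tmp so it runs anywhere, and the filename `example_output.txt` is hypothetical.

```python
import os
import tempfile

# On Databricks Runtime 11.0, the notebook's working directory in a Repo is
# read-only, so write scratch files to /tmp (or /databricks/driver) instead
# of the current working directory.
scratch_dir = tempfile.gettempdir()  # stand-in for /tmp on a cluster
path = os.path.join(scratch_dir, "example_output.txt")

with open(path, "w") as f:
    f.write("intermediate results\n")

print(os.path.exists(path))  # True
```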
Databricks Runtime 10.1 series support ends
June 14, 2022
Support for Databricks Runtime 10.1 and Databricks Runtime 10.1 for Machine Learning ended on June 14. See Databricks support lifecycles.
Delta Live Tables now supports SCD type 2
June 13-21, 2022: Version 3.74
Your Delta Live Tables pipelines can now use SCD type 2 to capture source data changes and retain the full history of updates to records. This enhances the existing Delta Live Tables support for SCD type 1. See The APPLY CHANGES APIs: Simplify change data capture with Delta Live Tables.
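The difference between the two SCD types can be illustrated in plain Python (this is a conceptual sketch, not the Delta Live Tables API): SCD type 2 retains history by closing out the current row and appending a new version on each change, rather than overwriting in place as SCD type 1 does. The field names and `apply_scd2` helper below are hypothetical.

```python
def apply_scd2(history, key, new_value, seq):
    """Apply one source change to an SCD type 2 history list (conceptual sketch)."""
    for row in history:
        if row["key"] == key and row["end_seq"] is None:
            row["end_seq"] = seq  # close out the current version
    # Append the new current version instead of overwriting in place.
    history.append({"key": key, "value": new_value,
                    "start_seq": seq, "end_seq": None})

history = []
apply_scd2(history, "user1", "Berlin", seq=1)
apply_scd2(history, "user1", "Paris", seq=2)

# Both versions are retained; only the latest is current (end_seq is None).
print(history)
```

With SCD type 1, the "Berlin" row would have been overwritten and the history lost.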
Create Delta Live Tables pipelines directly in the Azure Databricks UI
June 13-21, 2022: Version 3.74
You can now create a Delta Live Tables pipeline from the Create menu on the sidebar of the Azure Databricks UI.
Select the Delta Live Tables channel when you create or edit a pipeline
June 13-21, 2022: Version 3.74
You can now configure the channel for your Delta Live Tables pipeline with the Create pipeline and Edit pipeline settings dialogs. Previously, configuring the channel required editing the settings in the pipeline's JSON configuration.
Communicate between tasks in your Azure Databricks jobs with task values
June 13, 2022
You can now communicate values between tasks in your Azure Databricks jobs with task values. For example, you can use task values to pass the output of a machine learning model to downstream tasks in the same job run. See taskValues subutility (dbutils.jobs.taskValues).
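The flow can be sketched with a local stand-in: in a real job you would call `dbutils.jobs.taskValues.set(...)` in the producing task and `dbutils.jobs.taskValues.get(...)` in a downstream task, but `dbutils` is only available on a cluster, so this in-memory `TaskValues` class (hypothetical, and with a simplified signature) only illustrates the key/value flow between tasks.

```python
class TaskValues:
    """In-memory stand-in for the task values pattern (not the dbutils API)."""

    def __init__(self):
        self._store = {}

    def set(self, task_key, key, value):
        # In the real API, set() runs inside the producing task, so the
        # task key is implicit and only key/value are passed.
        self._store[(task_key, key)] = value

    def get(self, task_key, key, default=None):
        return self._store.get((task_key, key), default)

task_values = TaskValues()

# Upstream task: publish a model metric for downstream tasks in the same run.
task_values.set("train_model", "accuracy", 0.93)

# Downstream task: read the value back.
print(task_values.get("train_model", "accuracy", default=0.0))  # 0.93
```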
Enable account switching in the Databricks UI
June 8, 2022
If users belong to more than one account, they can now switch between accounts in the Databricks UI. To use the account switcher, click your email address at the top of the Databricks UI, hover over Switch account, and then select the account you want to navigate to.