These features and Azure Databricks platform improvements were released in October 2025.
Note
The release dates and content listed below correspond, in most cases, to actual deployments in the Azure public cloud.
They provide the evolution history of the Azure Databricks service on the Azure public cloud for your reference and may not apply to Azure operated by 21Vianet.
Note
Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.
Context-based ingress control (Beta)
October 9, 2025
Context-based ingress control is now in Beta. This enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Azure Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.
The billable usage table now records the performance mode of serverless jobs and pipelines
October 9, 2025
Billing logs now record the performance mode of serverless jobs and pipelines. The workload's performance mode is logged in the product_features.performance_target column and can include values of PERFORMANCE_OPTIMIZED, STANDARD, or null.
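For example, a query like the following surfaces serverless usage by performance mode. This is a minimal sketch run from a notebook; it assumes read access to the system.billing.usage table, and the billing_origin_product filter values are illustrative.

```python
# Minimal sketch: aggregate serverless DBU usage by performance mode.
# Assumes read access to system.billing.usage; the billing_origin_product
# filter values below are illustrative.
df = spark.sql("""
    SELECT usage_date,
           product_features.performance_target AS performance_mode,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE billing_origin_product IN ('JOBS', 'DLT')
    GROUP BY usage_date, product_features.performance_target
    ORDER BY usage_date DESC
""")
display(df)
```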
Databricks Runtime maintenance updates
October 7, 2025
New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.
Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS ML are in Beta
October 6, 2025
Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS ML are now in Beta, powered by Apache Spark 4.0.0. The release includes new configuration options, improved error handling, and enhanced Spark Connect support.
See Databricks Runtime 17.3 LTS (Beta) and Databricks Runtime 17.3 LTS for Machine Learning (Beta).
Partition metadata is generally available
October 6, 2025
You can now enable partition metadata logging, a partition discovery strategy for external tables registered to Unity Catalog. See Use partition metadata logging.
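As a sketch, enabling it for a new external table might look like the following. The Spark conf name is the one described in Use partition metadata logging; the table and storage names are placeholders.

```python
# Minimal sketch: enable partition metadata logging before creating an
# external table registered to Unity Catalog. Table and storage names
# are placeholders.
spark.conf.set("spark.databricks.nonDelta.partitionLog.enabled", "true")
spark.sql("""
    CREATE TABLE main.raw.events (id BIGINT, event_date STRING)
    USING parquet
    PARTITIONED BY (event_date)
    LOCATION 'abfss://data@myaccount.dfs.core.windows.net/events'
""")
```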
Delta Sharing recipients can apply row filters and column masks (GA)
October 6, 2025
Delta Sharing recipients can now apply their own row filters and column masks on shared tables and shared foreign tables. However, Delta Sharing providers still cannot share data assets that have row-level security or column masks.
For details, see Apply row filters and column masks.
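As a recipient, applying a filter might look like the following sketch. It assumes the share is mounted as a catalog named shared_catalog and uses standard Unity Catalog row-filter syntax; all catalog, schema, and column names are placeholders.

```python
# Minimal sketch: a recipient defines a filter function locally, then
# attaches it to a shared table. Names below are placeholders.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.filters.us_only(region STRING)
    RETURN region = 'US'
""")
spark.sql("""
    ALTER TABLE shared_catalog.sales.orders
    SET ROW FILTER main.filters.us_only ON (region)
""")
```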
Certification status system tag is in Public Preview
October 6, 2025
You can now apply the system.certification_status governed tag to catalogs, schemas, tables, views, volumes, dashboards, registered models, and Genie Spaces to indicate whether a data asset is certified or deprecated. This improves governance, discoverability, and trust in analytics and AI workloads.
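Applying the tag with SQL might look like this sketch. The tag key comes from this release note; the value shown is an assumption about the allowed values, and the table name is a placeholder.

```python
# Minimal sketch: mark a table as certified using the governed tag.
# The value 'certified' is an assumed allowed value.
spark.sql("""
    ALTER TABLE main.analytics.daily_revenue
    SET TAGS ('system.certification_status' = 'certified')
""")
```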
Prompt caching is now supported for Claude models
October 3, 2025
Prompt caching is now supported for Databricks-hosted Claude models. You can specify the cache_control parameter in your query requests to cache the following:
- Thinking message content in the messages.content array.
- Image content blocks in the messages.content array.
- Tool use, results, and definitions in the tools array.
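A request using cache_control might look like the following sketch. It assumes an OpenAI-compatible Databricks serving endpoint that passes Anthropic-style cache_control blocks through; the workspace URL and model name are placeholders.

```python
# Minimal sketch: cache a large content block so repeat queries can
# reuse it. Endpoint URL and model name are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="<databricks-token>",
    base_url="https://<workspace-url>/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-claude-sonnet-4",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "<large reference document reused across queries>",
                    # Mark this block for caching.
                    "cache_control": {"type": "ephemeral"},
                },
                {"type": "text", "text": "Summarize the document above."},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```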
Notebook improvements
October 3, 2025
The following notebook improvements are now available:
- The cell execution minimap now appears in the right margin of notebooks. Use the minimap to get a visual overview of your notebook's run status and quickly navigate between cells. See Cell execution minimap.
- Use Databricks Assistant to help diagnose and fix environment errors, including library installation errors.
- When reconnecting to serverless notebooks, sessions are automatically restored with the notebook's Python variables and Spark state. See Automated session restoration for serverless notebooks.
- PySpark authoring completion now supports agg, withColumns, withColumnsRenamed, and filter/where clauses (see the sketch after this list).
- Databricks now supports importing and exporting IPYNB notebooks up to 100 MB. Revision snapshot autosaving, manual saving, and cloning are supported for all notebooks up to 100 MB. See Notebook sizing.
- When cloning and exporting notebooks, you can now choose whether to include cell outputs. See Manage notebook format.
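For reference, here is a short PySpark example exercising the methods named above. It runs on any recent Spark session with a small sample DataFrame.

```python
# Short reference for the completion-supported methods: withColumns,
# withColumnsRenamed, filter/where, and agg.
from pyspark.sql import functions as F

df = spark.createDataFrame([(1, "a", 10), (2, "b", 20)], ["id", "grp", "amt"])

df2 = (
    df.withColumns({"amt_x2": F.col("amt") * 2})
      .withColumnsRenamed({"grp": "group"})
      .filter(F.col("amt_x2") > 10)   # .where() is an alias for .filter()
)
summary = df2.groupBy("group").agg(F.sum("amt").alias("total_amt"))
summary.show()
```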
Convert to Unity Catalog managed table from external table
October 2, 2025
The ALTER TABLE ... SET MANAGED command is now generally available. This command seamlessly converts Unity Catalog external tables to managed tables. It allows you to take full advantage of Unity Catalog managed table features, such as enhanced governance, reliability, and performance. See Convert an external table to a managed Unity Catalog table.
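For example, run the following from a notebook; the three-level table name is a placeholder.

```python
# Minimal sketch: convert a Unity Catalog external table to a managed
# table. The table name is a placeholder.
spark.sql("ALTER TABLE main.sales.orders_external SET MANAGED")
```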
Git email identity configuration for Git folders
October 1, 2025
You can now specify a Git provider email address, separate from your username, when creating Git credentials for Databricks Git folders. This email is used as the Git author and committer identity for all commits made through Git folders, ensuring proper attribution in your Git provider and better integration with your Git account.
The email you provide becomes the GIT_AUTHOR_EMAIL and GIT_COMMITTER_EMAIL for commits, allowing Git providers to properly associate commits with your user account and display your profile information. If no email is specified, Databricks uses your Git username as the email address (legacy behavior).
See Git commit identity and email configuration.
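Creating a credential with an email might look like the following sketch, using the Databricks SDK for Python. The git_email parameter name is an assumption for illustration, not confirmed API; check Git commit identity and email configuration for the actual field.

```python
# Minimal sketch: create a Git credential with a separate author email.
# The git_email parameter name is hypothetical; token and username are
# placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
cred = w.git_credentials.create(
    git_provider="gitHub",
    git_username="my-username",
    personal_access_token="<github-pat>",
    git_email="me@example.com",  # hypothetical field for the new email identity
)
```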
Configure Azure virtual network service endpoint policies for storage access (Public Preview)
October 1, 2025
Use Azure virtual network service endpoint policies to filter outbound traffic from the classic compute plane, ensuring connections are only made to specific Azure Storage accounts.
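Creating such a policy might look like the following sketch, using the Azure SDK for Python (azure-mgmt-network). Resource group, region, and resource IDs are placeholders; the policy still needs to be associated with the relevant subnets.

```python
# Minimal sketch: create a service endpoint policy that restricts
# Microsoft.Storage traffic to one approved storage account.
# All names and IDs below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.service_endpoint_policies.begin_create_or_update(
    resource_group_name="databricks-rg",
    service_endpoint_policy_name="allow-specific-storage",
    parameters={
        "location": "eastus2",
        "serviceEndpointPolicyDefinitions": [
            {
                "name": "allow-my-storage",
                "service": "Microsoft.Storage",
                "serviceResources": [
                    "/subscriptions/<sub>/resourceGroups/<rg>/providers"
                    "/Microsoft.Storage/storageAccounts/<account>"
                ],
            }
        ],
    },
)
policy = poller.result()
print(policy.id)
```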
New permissions for the Databricks GitHub App
October 1, 2025
If you own an Azure Databricks account with the Azure Databricks GitHub app installed, you may receive an email titled "Databricks is requesting updated permissions" from GitHub.
This is a legitimate request from Databricks. It asks you to approve a new permission that allows Azure Databricks to read your GitHub account email(s). Granting this permission will let Azure Databricks retrieve and save your primary GitHub account email to your Linked Git credential in Azure Databricks. In an upcoming feature, this will ensure that commits made from Azure Databricks are properly linked to your GitHub identity.
If you don't accept the new permission, your Linked Git credential will still authenticate with GitHub. However, future commits from this credential will not be associated with your GitHub account identity.