August 2025

These features and Azure Databricks platform improvements were released in August 2025.

Note

The release dates and content listed below correspond to actual deployments on the Azure Public Cloud in most cases.

This list provides the evolution history of the Azure Databricks service on the Azure Public Cloud for your reference, and may not apply to Azure operated by 21Vianet.

Note

Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.

Interact with Azure Databricks jobs in a Power Automate flow (Public Preview)

August 29, 2025

You can now interact with existing Azure Databricks jobs within Power Automate. To do so, add an Azure Databricks job action to your flow.

See Use your Azure Databricks data to build Power Automate flows.

Track and navigate notebook runs with the new cell execution minimap

August 28, 2025

Use the cell execution minimap to track your notebook's progress at a glance. The minimap appears in the right margin and shows each cell's execution state (skipped, queued, running, success, or error). Hover to see cell details, or click to jump directly to a cell.

For information about using the cell execution minimap, see Navigate the Databricks notebook and file editor.

Admins can now manage a workspace's serverless base environments (Public Preview)

August 28, 2025

Base environments are custom environment specifications for serverless notebooks that define a serverless environment version and a set of dependencies.

Workspace admins can now create and manage the base environments available in their workspace. They can also set a default base environment for all new serverless notebooks. This allows workspace users to quickly start working in a consistent, cached environment.

Lakebase synced tables support syncing Apache Iceberg and foreign tables in Snapshot mode

August 28, 2025

You can now create synced tables in Snapshot sync mode from Iceberg tables or foreign tables.

External MCP servers are in Beta

August 28, 2025

Users can now connect Databricks to external Model Context Protocol (MCP) servers. This allows agents to access tools outside of Databricks.

Migrating Lakeflow Declarative Pipelines from legacy publishing mode is now GA

August 28, 2025

Lakeflow Declarative Pipelines has a legacy publishing mode that allows publishing only to a single catalog and schema. The default publishing mode enables publishing to multiple catalogs and schemas. Migration from the legacy publishing mode to the default publishing mode is now generally available.

See Enable the default publishing mode in a pipeline.

Governed tags are Public Preview

August 26, 2025

You can now create governed tags to enforce consistent tagging across data assets such as catalogs, schemas, and tables. Using governed tags, admins define the allowed keys and values and control which users and groups can assign them to objects. This helps standardize metadata for data classification, cost tracking, access control, and automation.

Databricks Runtime 17.2 and Databricks Runtime 17.2 ML are in Beta

August 26, 2025

Databricks Runtime 17.2 and Databricks Runtime 17.2 ML are now in Beta. These releases include SQL and API enhancements, new migration options, and reliability and performance improvements across the platform.

See Databricks Runtime 17.2 (Beta) and Databricks Runtime 17.2 for Machine Learning (Beta).

Selectively and atomically replace data with INSERT REPLACE USING and INSERT REPLACE ON (GA)

August 26, 2025

INSERT REPLACE USING and INSERT REPLACE ON are now generally available for Databricks Runtime 17.2. Both SQL commands replace part of the table with the result of a query.

INSERT REPLACE USING replaces rows whose USING column values are equal in the target and the source. INSERT REPLACE ON replaces rows that match a user-defined condition.
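The two commands can be sketched as follows. This is a minimal illustration of the behavior described above; the table and column names are hypothetical, and the exact grammar is documented in the INSERT reference.

```sql
-- Hypothetical target table keyed by region and day.
CREATE TABLE sales (region STRING, day DATE, amount DECIMAL(10, 2));

-- Atomically replace the target rows whose (region, day) values
-- also appear in the query result, then insert that result.
INSERT INTO sales REPLACE USING (region, day)
SELECT region, day, amount FROM sales_updates;

-- Atomically replace the target rows matching an explicit condition.
INSERT INTO sales REPLACE ON region = 'EMEA'
SELECT region, day, amount FROM emea_restated;
```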

See INSERT in the SQL language reference and Selectively overwrite data with Delta Lake.

OAuth token federation is now GA

August 26, 2025

OAuth token federation is now generally available. Token federation enables you to securely access Azure Databricks APIs using tokens from your identity provider (IdP). You can configure token federation policies directly in the Azure Databricks UI, or using the Azure Databricks CLI or REST API.

See Configure a federation policy.

New table property to control Delta table compression

August 26, 2025

You can now explicitly set the compression codec for a Delta table using the delta.parquet.compression.codec table property. This property ensures that all future writes to the table use the chosen codec. See Delta table properties reference.
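A minimal sketch of setting the property, assuming a hypothetical table name and using zstd as the example codec:

```sql
-- Set the Parquet compression codec for all future writes to an
-- existing Delta table (table name is hypothetical).
ALTER TABLE my_catalog.my_schema.events
SET TBLPROPERTIES ('delta.parquet.compression.codec' = 'zstd');

-- Or fix the codec at creation time.
CREATE TABLE my_catalog.my_schema.events_v2 (id BIGINT, payload STRING)
TBLPROPERTIES ('delta.parquet.compression.codec' = 'zstd');
```

Existing files are not rewritten; the codec applies to data written after the property is set.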

Data type mapping update for Lakebase synced tables

August 25, 2025

For newly created synced tables, TIMESTAMP types in source tables are now mapped to TIMESTAMP WITH TIMEZONE in synced tables. In existing synced tables, TIMESTAMP types continue to map to TIMESTAMP WITHOUT TIMEZONE.

Automatic liquid clustering is now available for Lakeflow Declarative Pipelines

August 25, 2025

You can now use automatic liquid clustering with Lakeflow Declarative Pipelines. Use automatic liquid clustering with CLUSTER BY AUTO, and Databricks intelligently chooses clustering keys to optimize query performance.
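A minimal pipeline sketch, assuming hypothetical object names:

```sql
-- Declarative pipeline materialized view with automatic liquid
-- clustering; CLUSTER BY AUTO lets Databricks choose and evolve
-- the clustering keys based on query patterns.
CREATE OR REFRESH MATERIALIZED VIEW daily_revenue
CLUSTER BY AUTO
AS SELECT order_date, customer_id, SUM(amount) AS revenue
   FROM orders
   GROUP BY order_date, customer_id;
```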

See Automatic liquid clustering, create_streaming_table, table, CREATE MATERIALIZED VIEW (Lakeflow Declarative Pipelines), and CREATE STREAMING TABLE (Lakeflow Declarative Pipelines).

Budget policy support for Lakebase database instances and synced tables (Public Preview)

August 25, 2025

You can now tag a database instance and a synced table with a budget policy to attribute billing usage to specific policies. Additionally, custom tags can be added to a database instance for more granular attribution of compute usage to teams, projects, or cost centers.

Salesforce Data Cloud File Sharing connector GA

August 22, 2025

The Salesforce Data Cloud File Sharing connector is now generally available.

See Lakehouse Federation for Salesforce Data Cloud File Sharing.

Updating workspace virtual network configurations is now in Public Preview

August 22, 2025

You can now update virtual network (VNet) configurations for existing Azure Databricks workspaces. This allows you to:

  • Migrate a workspace from an Azure Databricks-managed VNet to your own VNet (VNet injection).
  • Move a VNet injection workspace to a new VNet.
  • Replace existing subnets in a VNet injection workspace.

Enhanced autocomplete for complex data types in notebooks

August 22, 2025

Notebook autocomplete now supports enhanced suggestions for complex data types including structs, maps, and arrays in SQL cells. Additionally, when referencing common table expressions (CTEs) that use SELECT *, autocomplete provides column recommendations based on the underlying table structure.

See Personalized autocomplete.

Databricks Assistant integrated with compute

August 21, 2025

Users can now chat with Databricks Assistant on some compute pages. Use the Assistant chat panel to help you create a new compute resource, pool, and policy. Or, ask the Assistant questions about the compute resource, like "Is this compute resource Unity Catalog-enabled?" or "What's the current compute policy?"

To learn more about compute resources, see Compute.

Set the run-as user for Lakeflow Declarative Pipelines

August 18, 2025

You can now change the identity that a pipeline uses to run updates and the owner of tables published by the pipeline. This feature allows you to set a service principal as the run-as identity, which is safer and more reliable than using user accounts for automated workloads. Common use cases include recovering pipelines when the original owner was deactivated and deploying pipelines with service principals as a best practice.

For information about setting the run-as user, see Set the run-as user.

Lakeflow Declarative Pipelines template in bundles in the workspace (Public Preview)

August 14, 2025

You can now easily create an ETL pipeline in a bundle in the workspace using the new Lakeflow Declarative Pipelines template project.

Azure Databricks connector in Microsoft Power Platform is now GA

August 14, 2025

The Azure Databricks connector in Microsoft Power Platform is now generally available.

See Connect to Azure Databricks from Azure Power Platform.

Token-based rate limits now available on AI Gateway

August 14, 2025

You can now configure token-based rate limits on your model serving endpoints.

Databricks-hosted models for Assistant are now GA

August 14, 2025

Databricks Assistant with Databricks-hosted models is now generally available on all cloud platforms. This version of the Assistant is fully powered by models hosted and served directly on Databricks infrastructure — at no additional cost to customers.

OpenAI GPT OSS models now support function calling

August 13, 2025

The Databricks-hosted foundation models OpenAI GPT OSS 120B and GPT OSS 20B now support function and tool calling.

Single-node compute on standard access mode is now GA

August 12, 2025

Single-node compute resources with standard access mode are now generally available. This configuration allows multiple users to share a single-node compute resource with full user isolation. Single-node compute is useful for small jobs or non-distributed workloads.

See Compute configuration reference.

Column masks now retained when replacing a table

August 12, 2025

If a column in the new table matches a column name from the original table, its existing column mask is now retained, even if no mask is specified. This change prevents accidental removal of column-level security policies during table replacement. Previously, replacing a table dropped all existing column masks, and only newly defined masks were applied.

This change affects SQL commands ([CREATE OR] REPLACE TABLE), DataFrame APIs (saveAsTable, replace, createOrReplace), and other similar update table operations.
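The new behavior can be sketched as follows, using a hypothetical masking function and table:

```sql
-- Hypothetical mask: reveal emails only to members of an admins group.
CREATE FUNCTION mask_email(email STRING) RETURNS STRING
RETURN CASE WHEN is_account_group_member('admins') THEN email ELSE '***' END;

CREATE TABLE users (id BIGINT, email STRING MASK mask_email);

-- Replacing the table now retains the mask on the matching column
-- `email`, even though no mask is specified here. Previously, this
-- statement would have dropped the mask.
CREATE OR REPLACE TABLE users (id BIGINT, email STRING, region STRING);
```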

See Manually apply row filters and column masks.

Access requests in Unity Catalog (Public Preview)

August 11, 2025

You can now enable self-service access requests in Unity Catalog by configuring access request destinations on securable objects.

Users can request access to Unity Catalog objects that they discover. These requests are sent to configured destinations, such as emails, Slack or Microsoft Teams channels, or they can be redirected to an internal access management system.

You can also enable default email destinations so requests automatically go to the catalog or object owner's email if no other destination is set. This ensures that access requests are delivered even when no destination is manually configured for an object.

See Manage access request destinations.

Disable legacy features for new workspaces (Public Preview)

August 11, 2025

A new account console setting allows account admins to disable certain legacy features on new workspaces created in their account. If set, new workspaces will not include DBFS root and mounts, the Hive metastore, no-isolation shared compute, or Databricks Runtime versions prior to 13.3 LTS. If needed, workspace admins can still enable these legacy features from their workspace settings.

See Disable access to legacy features in new workspaces.

Lakebase Public Preview enabled by default

August 11, 2025

The Lakebase: Managed Postgres OLTP Database preview is now enabled by default. Users can start creating Lakebase database instances without needing workspace admins to enable this preview first. Admins can disable the preview if needed.

See Manage Azure Databricks Previews.

ServiceNow connector GA

August 8, 2025

The fully managed ServiceNow ingestion connector in Lakeflow Connect is now generally available.

Create external Delta tables from third-party clients (Public Preview)

August 8, 2025

You can now create Unity Catalog external tables backed by Delta Lake from external clients and systems, such as Apache Spark.

See Create external Delta tables from external clients.

Path credential vending (Public Preview)

August 8, 2025

You can now use path credential vending to grant short-lived credentials to external locations in your Unity Catalog metastore. See Unity Catalog credential vending for external system access.

Databricks ODBC driver 2.9.2

August 5, 2025

The Databricks ODBC Driver version 2.9.2 is now available for download from the ODBC driver download page.

This release includes the following fixes and new features:

  • The process name is now used as the default UserAgentEntry if the UserAgentEntry is not explicitly set.
  • Added support for Databricks domains cloud.databricks.us and cloud.databricks.mil.
  • Enhanced recognition and handling of timestamp_ntz columns across multiple data source functions including SQLGetTypeInfo, SQLColumns, and SQLColAttribute.
  • Added CRL (Certificate Revocation Lists) cache support on Windows when UseSystemTruststore is enabled.
  • Added VOID type column support, so that VOID columns are now correctly listed in SQLGetColumns.
  • Enabled OAuth Token exchange for IDPs different from the host, which allows the exchange of OAuth access tokens (including BYOT) for Databricks in-house tokens.
  • Added support for Windows Server 2025.
  • Fixed a memory leak in the driver.

This release includes upgrades to several third-party libraries:

  • OpenSSL upgraded from 3.0.15 to 3.0.16
  • libcURL upgraded from 8.11.0 to 8.12.1
  • Expat upgraded from 2.6.3 to 2.7.1

This release includes the following behavior changes:

  • The connector no longer supports Databricks Runtime version 10.4 LTS.
  • The default maximum catalog name length and maximum schema name length have been changed from 128 to 1024.
  • Ubuntu 20.04 is no longer supported.

For complete configuration information, see the Databricks ODBC Driver Guide installed with the driver download package.

Power BI Azure Databricks connector supports M2M OAuth

August 5, 2025

You can now authenticate to Azure Databricks from Power BI Desktop using Machine-to-Machine (M2M) OAuth credentials. If you are currently using personal access tokens on behalf of service principals, Azure Databricks recommends switching to the new Client credentials authentication option.

Jobs in continuous mode can now have task-level retries for failed tasks

August 11, 2025

Jobs that are set to run in continuous mode now have the option to retry individual tasks on task failure.

See Run jobs continuously.

Databricks Runtime 17.1 is now GA

August 1, 2025

Databricks Runtime 17.1 is now generally available. See Databricks Runtime 17.1.

AI Playground is now GA

August 1, 2025

AI Playground is now generally available.

Edit mode in Assistant does multi-cell code refactoring and more

August 1, 2025

Databricks Assistant now supports edit mode, which performs reasoning and editing steps across multiple cells in a notebook. Edit mode handles complex, multi-step tasks efficiently, coordinating changes across multiple cells without requiring manual intervention or repetitive prompting.