September 2025

These features and Azure Databricks platform improvements were released in September 2025.

Note

The release dates and content listed below correspond, in most cases, to the actual deployment on the Azure Public Cloud.

This page provides the release history of the Azure Databricks service on the Azure Public Cloud for your reference. It may not be consistent with the actual deployment on Azure operated by 21Vianet.

Note

Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.

Pipeline update timeline table is now available (Public Preview)

September 30, 2025

The system.lakeflow.pipeline_update_timeline table provides complete historical tracking of pipeline updates and supports fine-grained analysis of update activity, triggers, results, and compute usage.
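As a sketch of the kind of analysis this table enables, the following query summarizes update counts and average durations per pipeline over the last week. The column names (pipeline_id, start_time, end_time) are assumptions based on similar Lakeflow system tables; verify them against the actual schema in your workspace.

```sql
-- Column names are assumptions; verify with
-- DESCRIBE TABLE system.lakeflow.pipeline_update_timeline
SELECT
  pipeline_id,
  COUNT(*) AS update_count,
  AVG(timestampdiff(SECOND, start_time, end_time)) AS avg_duration_seconds
FROM system.lakeflow.pipeline_update_timeline
WHERE start_time >= current_date() - INTERVAL 7 DAYS
GROUP BY pipeline_id
ORDER BY avg_duration_seconds DESC;
```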

Data classification system table Beta

September 30, 2025

A new system table, system.data_classification.results, is now available in Beta. This table captures detections of sensitive data at the column level across all enabled catalogs in your metastore.
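A minimal sketch of querying the new table, for example to list which columns were flagged as sensitive. The column names here are illustrative assumptions; check the table's actual schema before relying on them.

```sql
-- Illustrative column names; verify with
-- DESCRIBE TABLE system.data_classification.results
SELECT catalog_name, schema_name, table_name, column_name, class_name
FROM system.data_classification.results
ORDER BY catalog_name, schema_name, table_name;
```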

Anthropic Claude Opus 4.1 now available as a Databricks-hosted foundation model

September 29, 2025

Mosaic AI Model Serving now supports Anthropic's Claude Opus 4.1 as a Databricks-hosted foundation model. You can access this model using Foundation Model APIs pay-per-token.
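One way to call a pay-per-token foundation model is the Databricks SQL ai_query function, sketched below. The endpoint name databricks-claude-opus-4-1 is an assumption; check the Serving page in your workspace for the exact name.

```sql
-- Endpoint name is assumed; confirm it on the Serving page
SELECT ai_query(
  'databricks-claude-opus-4-1',
  'Summarize the trade-offs between SCD type 1 and SCD type 2.'
) AS response;
```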

Lakeflow Pipelines Editor is now in public preview

September 29, 2025

The Lakeflow Pipelines Editor (previously called the multi-file editor) is now in public preview. The Lakeflow Pipelines Editor shows a pipeline as a set of files in the pipeline assets browser. You can edit the files and control the configuration of the pipeline and which files to include in one location. This also changes the default source code format for pipelines from notebooks to Python and SQL code files.

See Develop and debug ETL pipelines with the Lakeflow Pipelines Editor.

New requirement to create connections for Salesforce ingestion

September 29, 2025

In early September 2025, Salesforce began restricting the use of uninstalled connected apps. This restriction does not break existing Unity Catalog connections to Salesforce, but it prevents you from creating a connection to a new Salesforce instance without the Databricks connected app installed.

For background, see Prepare for Connected App Usage Restrictions Change in the Salesforce documentation.

Migration of Lakeflow Declarative Pipelines from legacy publishing mode is generally available

September 24, 2025

Lakeflow Declarative Pipelines has a legacy publishing mode that only allowed publishing to a single catalog and schema. The default publishing mode enables publishing to multiple catalogs and schemas. Migration from the legacy publishing mode to the default publishing mode is now generally available (issues encountered in the previous release of this feature have been fixed).

See Enable the default publishing mode in a pipeline.

Databricks Runtime maintenance updates

September 24, 2025

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.

Mosaic AI Agent Framework supports automatic authentication passthrough for Lakebase resources

September 23, 2025

Mosaic AI Agent Framework now supports automatic authentication passthrough for Lakebase resources. This requires MLflow 3.3.2 or above.

Route-optimized endpoints now require route-optimized URL path for querying

September 22, 2025

All newly created route-optimized endpoints must be queried using the route-optimized URL. Queries using the standard workspace URL path are not supported for route-optimized endpoints created after September 22, 2025.

Explore table data using an LLM (Public Preview)

September 22, 2025

You can now ask natural language questions about sample data using Catalog Explorer. The Assistant uses metadata context and table usage patterns to generate a SQL query. You can then validate the query and run it against the underlying table. See Explore table data using an LLM.

Databricks One Public Preview

September 17, 2025

Databricks One, a simplified user interface designed for business users, is now in Public Preview. Databricks One provides a single, intuitive entry point to interact with data and AI in Databricks, without requiring technical knowledge of compute resources, queries, models, or notebooks.

With Databricks One, business users can:

  • View and interact with AI/BI dashboards to track KPIs and analyze metrics.
  • Ask data questions in natural language using AI/BI Genie.
  • Use custom-built Databricks Apps that combine analytics, AI, and workflows.

Workspace admins can enable Databricks One from the Previews page in the admin console.

See What is Databricks One?.

Discover files in Auto Loader efficiently using file events without enrollment (Public Preview)

September 16, 2025

The cloudFiles.useManagedFileEvents option for Auto Loader is now in ungated Public Preview. This option allows Auto Loader to discover new files efficiently using file events. For details, see the following:

Databricks Runtime 17.2 is now GA

September 16, 2025

Databricks Runtime 17.2 is now generally available. See Databricks Runtime 17.2.

Delta Sharing on Lakehouse Federation is in Beta

September 16, 2025

You can now use Delta Sharing to share foreign schemas and tables created with query federation in Databricks-to-Databricks sharing and open sharing. See Add foreign schemas or tables to a share and Read data in a shared foreign table or foreign schema.

Mount Delta shares to an existing shared catalog

September 12, 2025

Delta Sharing recipients can now mount shares received from their Delta Sharing provider to an existing shared catalog. Previously, recipients needed to create a new catalog for each new share. See Create a catalog from a share.
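For context, the previous per-share flow looked like the following sketch, where a recipient creates a dedicated catalog from a share (the provider and share names are placeholders):

```sql
-- Placeholder provider and share names
CREATE CATALOG sales_shared USING SHARE acme_provider.sales_share;
```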

Python custom data sources can be used with Lakeflow Declarative Pipelines

September 10, 2025

You can use Python custom data sources and sinks in your pipeline definitions in Lakeflow Declarative Pipelines.

For information about Python custom data sources, see the following:

Automatic identity management is generally available

September 10, 2025

Automatic identity management enables you to sync users, service principals, and groups from Microsoft Entra ID into Azure Databricks without configuring an application in Microsoft Entra ID. When enabled, you can directly search in identity federated workspaces for Microsoft Entra ID users, service principals, and groups and add them to your workspace. Databricks uses Microsoft Entra ID as the source of record, so any changes to group memberships are respected in Azure Databricks. Automatic identity management also supports nested groups.

See Sync users and groups automatically from Microsoft Entra ID.

Lakeflow Declarative Pipelines now supports stream progress metrics in Public Preview

September 10, 2025

Lakeflow Declarative Pipelines now supports querying the event log for metrics about the progress of a stream. See Monitor pipeline streaming metrics.
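As a sketch, the event log can be queried with the event_log table-valued function. The event_type filter value and the details field shown below are assumptions; consult the event log schema documented for your runtime.

```sql
-- Pipeline ID is a placeholder; event_type value is an assumption
SELECT timestamp, details
FROM event_log('<pipeline-id>')
WHERE event_type = 'flow_progress'
ORDER BY timestamp DESC;
```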

Databricks Runtime maintenance update

September 8, 2025

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.

Databricks Apps support for Genie resources

September 8, 2025

Databricks Apps now supports adding an AI/BI Genie space as an app resource to enable natural language querying over curated datasets.

Databricks Online Feature Stores (Public Preview)

September 5, 2025

Databricks Online Feature Stores, powered by Lakebase, provide highly scalable, low-latency access to feature data while maintaining consistency with your offline feature tables. Native integrations with Unity Catalog, MLflow, and Mosaic AI Model Serving help you productionize your model endpoints, agents, and rule engines, so they can automatically and securely access features from Online Feature Stores while maintaining high performance.

MLflow metadata is now available in system tables (Public Preview)

September 5, 2025

MLflow metadata is now available in system tables. View metadata managed by the MLflow tracking service across the entire workspace in one central location, taking advantage of all the lakehouse tooling Databricks offers, such as custom AI/BI dashboards, SQL alerts, and large-scale data analytics queries.
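A simple starting point is to list what the preview exposes. The schema name system.mlflow is an assumption; adjust if the tables land under a different schema in your workspace.

```sql
-- Schema name is an assumption; adjust as needed
SHOW TABLES IN system.mlflow;
```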

Databricks Assistant Agent Mode: Data Science Agent is in Beta

September 3, 2025

Agent Mode for Databricks Assistant is now in Beta. In Agent Mode, the Assistant can orchestrate multi-step workflows from a single prompt.

The Data Science Agent is custom-built for data science workflows and can build an entire notebook for tasks like EDA, forecasting, and machine learning from scratch. Using your prompt, it can plan a solution, retrieve relevant assets, run code, use cell outputs to improve results, fix errors automatically, and more.

Tables backed by default storage can be Delta shared to any recipient (Beta)

September 2, 2025

Delta Sharing providers can now share tables backed by default storage with any recipient, including both open and Azure Databricks recipients—even if the recipient is using classic compute. Tables with partitioning enabled are an exception.

Migrate Lakeflow Declarative Pipelines from legacy publishing mode is rolled back to Public Preview

September 2, 2025

Lakeflow Declarative Pipelines includes a legacy publishing mode that previously limited publishing to a single catalog and schema. Default publishing mode enables publishing to multiple catalogs and schemas. A feature, recently released as generally available, can help migrate from the legacy publishing mode to the default publishing mode. Due to an issue found after release, the migration feature has been rolled back to Public Preview status and functionality.

See Enable the default publishing mode in a pipeline.

AI agents: Authorize on-behalf-of-user Public Preview

September 2, 2025

AI agents deployed to Model Serving endpoints can use on-behalf-of-user authorization. This lets an agent act as the Databricks user who runs the query for added security and fine-grained access to sensitive data.

SQL Server connector supports SCD type 2

September 1, 2025

The Microsoft SQL Server connector in Lakeflow Connect now supports SCD type 2. This setting, known as history tracking or slowly changing dimensions (SCD), determines how to handle changes in your data over time. With history tracking off (SCD type 1), outdated records are overwritten as they're updated and deleted in the source. With history tracking on (SCD type 2), the connector maintains a history of those changes.
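The SQL Server connector sets this behavior in the ingestion pipeline configuration, but the same SCD type 2 semantics can be illustrated with the declarative pipelines APPLY CHANGES syntax, where STORED AS SCD TYPE 2 preserves a history row for each change. This is an illustrative sketch with placeholder table and column names, not the connector's own configuration.

```sql
-- Illustrative only: declarative-pipeline CDC with history tracking,
-- not the SQL Server connector configuration itself
CREATE OR REFRESH STREAMING TABLE customers_history;

APPLY CHANGES INTO customers_history
FROM STREAM(source_updates)
KEYS (customer_id)
SEQUENCE BY change_timestamp
STORED AS SCD TYPE 2;
```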