October 2025

These features and Azure Databricks platform improvements were released in October 2025.

Note

In most cases, the release dates and content listed below correspond only to the actual deployment on the Azure Public Cloud.

This list provides the evolution history of the Azure Databricks service on the Azure Public Cloud for your reference, and may not be consistent with the actual deployment on Azure operated by 21Vianet.

Note

Releases are staged. Your Azure Databricks account might not be updated until a week or more after the initial release date.

Databricks Assistant agent mode can now use models served through Anthropic on Databricks

October 31, 2025

Databricks Assistant agent mode can now use models served through Anthropic on Databricks when partner-powered AI features are enabled. Anthropic on Databricks uses endpoints hosted by Databricks Inc. in AWS within the Databricks security perimeter.

Skip cells when running notebooks

October 31, 2025

You can now skip individual cells when running multiple cells in a notebook using the %skip magic command. Add %skip at the beginning of any cell you want to skip. See Run Databricks notebooks.
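
As a minimal sketch (the table name is hypothetical), a Python cell marked with %skip is passed over when you run multiple cells:

    %skip
    # This cell is skipped during multi-cell or run-all execution; useful for
    # scratch work or expensive cells you want to keep but not run every time.
    expensive_df = spark.read.table("main.analytics.raw_events")
    expensive_df.cache().count()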

Improved notebook debugger experience

October 31, 2025

The Python notebook interactive debugger now supports multi-file debugging. You can set breakpoints and step into functions across multiple workspace files. The debugger automatically opens the file in a new tab when you step into it. This improvement makes it easier to debug code that spans multiple files in your workspace.

See Step into workspace files.

New compute policy form is now generally available

October 29, 2025

The new compute policy form is now generally available. The new form allows you to configure policy definitions using dropdown menus and other UI elements. This means admins can write policies without having to learn or reference the policy syntax.

See Create and manage compute policies and Compute policy reference.

Agent Bricks: Multi-Agent Supervisor now supports Unity Catalog functions and external MCP servers

October 29, 2025

Use Agent Bricks: Multi-Agent Supervisor to create a supervisor system that coordinates Genie Spaces, agent endpoints, and tools to complete complex tasks across different, specialized domains. You can now provide the supervisor system with tools such as Unity Catalog functions and external MCP servers. The supervisor agent delegates relevant tasks to those tools.

Feedback model deprecated for AI agents

October 29, 2025

The experimental feedback model for AI agents has been deprecated. Starting November 1, 2025, newly deployed agents won't include a feedback model. Upgrade to MLflow 3 and use the log_feedback API to collect assessments on agent traces. See Feedback model (deprecated).
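
As a hedged sketch of the replacement path, assuming MLflow 3's mlflow.log_feedback API; the trace ID, assessment name, and source ID are placeholders:

    import mlflow
    from mlflow.entities import AssessmentSource, AssessmentSourceType

    # Attach a feedback assessment to an existing agent trace.
    mlflow.log_feedback(
        trace_id="tr-0123456789abcdef",    # placeholder trace ID
        name="user_satisfaction",          # assessment name
        value=True,                        # e.g., a thumbs-up from the user
        source=AssessmentSource(
            source_type=AssessmentSourceType.HUMAN,
            source_id="user@example.com",
        ),
        rationale="Answer was accurate and well sourced.",
    )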

Request logs and assessment logs tables deprecated

October 29, 2025

The payload_request_logs and payload_assessment_logs tables are deprecated. Starting November 1, 2025, newly deployed agents won't have these tables. Starting November 15, 2025, existing tables won't be populated with new data. Upgrade to MLflow 3 for real-time tracing or use the provided views. See Agent inference tables: Request and assessment logs (deprecated).

Databricks JDBC Driver 2.7.5

October 23, 2025

Databricks JDBC Driver (Simba) version 2.7.5 is now available with the following improvements:

New features

The connector now supports Kerberos with proxy connections. To enable Kerberos proxy, set UseProxy=1 and ProxyAuth=2. To set proxy details, use ProxyHost, ProxyPort, ProxyKrbRealm, ProxyKrbFQDN, and ProxyKrbService.
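
For illustration, a Databricks JDBC URL carrying these properties might look as follows; the workspace, warehouse, and proxy values are placeholders, while the Kerberos proxy properties are those documented for driver 2.7.5:

    # Placeholder workspace, warehouse, and proxy values.
    jdbc_url = (
        "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443/default;"
        "transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/abcdef1234567890;"
        "UseProxy=1;ProxyAuth=2;"          # route through a proxy, Kerberos auth
        "ProxyHost=proxy.corp.example.com;ProxyPort=3128;"
        "ProxyKrbRealm=CORP.EXAMPLE.COM;"
        "ProxyKrbFQDN=proxy.corp.example.com;"
        "ProxyKrbService=HTTP"
    )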

Resolved issues

  • Fixed an issue where the connector failed to run complex queries that contained ? characters in native mode.
  • Fixed intermittent failures in Unity Catalog volume ingestion caused by unexpected connector behavior.
  • Fixed an assertion error in getColumns when a table included a column of type Void or Variant and the java -ea flag was enabled.

Zerobus Ingest connector in Lakeflow Connect (Public Preview)

October 23, 2025

The Zerobus Ingest connector in Lakeflow Connect is in Public Preview. This connector enables record-by-record data ingestion directly into Delta tables using a gRPC API.

Column drop behavior updated

October 22, 2025

When you attempt to drop a column that has one or more governed tags assigned, the operation now fails. To drop a tagged column, you must first remove all governed tags from it. See Drop a column with governed tags.
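
As a sketch (the table, column, and governed tag names are hypothetical), remove the tags before dropping the column:

    # Unset the governed tag on the column, then drop the column.
    spark.sql("ALTER TABLE main.hr.employees ALTER COLUMN ssn UNSET TAGS ('pii')")
    spark.sql("ALTER TABLE main.hr.employees DROP COLUMN ssn")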

Databricks Runtime 17.3 LTS is now GA

October 22, 2025

Databricks Runtime 17.3 LTS is now generally available. See Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS for Machine Learning.

Compatibility Mode (Public Preview)

October 21, 2025

Compatibility Mode is now in Public Preview. Compatibility Mode generates a read-only version of a Unity Catalog managed table, streaming table, or materialized view that is automatically synced with the original table. This enables external Delta Lake and Iceberg clients, such as Amazon Athena, Snowflake, and Microsoft Fabric, to read your tables and views without sacrificing performance on Azure Databricks. You can configure how often your read-only versions are refreshed, up to near real time.

Zstd is now the default compression for new managed tables

October 21, 2025

All newly created managed tables in Databricks Runtime 16.0 and above now use Zstandard (Zstd) compression by default instead of Snappy.

Existing tables continue to use their current compression codec. To change the compression codec for an existing table, set the delta.parquet.compression.codec table property. See Delta table properties reference.
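
For example (the table name is hypothetical), to switch an existing table to Zstd:

    # Set the Delta table property named in this note. New Parquet files are
    # written with Zstd; existing files are unchanged until they are rewritten.
    spark.sql("""
        ALTER TABLE main.sales.orders
        SET TBLPROPERTIES ('delta.parquet.compression.codec' = 'zstd')
    """)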

Databricks Runtime maintenance updates (round 2)

October 21, 2025

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.

Unified runs list (Public Preview)

October 20, 2025

The unified runs list is in Public Preview. Monitor both job and pipeline runs in a single unified list.

See What changes are in the Unified Runs List preview?.

Dashboard tagging (Public Preview)

October 16, 2025

You can now add tags to dashboards and Genie spaces to improve organization across your workspace. Tags can also be used for automation. For example, you can tag a dashboard as "Work in progress," and an overnight process can use the API to retrieve all dashboards with that tag and assign them to a temporary warehouse until they're tagged as "Certified." Searching by dashboard tags is not supported.

See Manage dashboard tags.

Jobs can now be triggered on source table update

October 16, 2025

You can now create triggers for jobs to run when a source table is updated.
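
A minimal sketch with the Databricks SDK for Python, assuming its table-update trigger types; the job ID and table name are placeholders:

    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import jobs

    w = WorkspaceClient()
    # Fire the job whenever the listed source table receives an update.
    w.jobs.update(
        job_id=123,  # placeholder job ID
        new_settings=jobs.JobSettings(
            trigger=jobs.TriggerSettings(
                table_update=jobs.TableUpdateTriggerConfiguration(
                    table_names=["main.sales.orders"],
                ),
            ),
        ),
    )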

The SAP Business Data Cloud (BDC) Connector for Azure Databricks is generally available

October 15, 2025

The SAP BDC Connector enables secure, zero-copy data sharing between SAP BDC and a Unity Catalog-enabled Azure Databricks workspace. Access and analyze SAP BDC data on Azure Databricks, and share Azure Databricks data assets back to SAP BDC for unified analytics across both platforms.

See Share data between SAP Business Data Cloud (BDC) and Azure Databricks.

Create backfill job runs

October 14, 2025

Job backfills allow you to trigger job runs to backfill data from the past. This is useful for loading older data, or repairing data when there are failures in processing. For more details, see Backfill jobs.

Improved autoscaling behavior for Mosaic AI Model Serving

October 13, 2025

Autoscaling in Mosaic AI Model Serving has been tuned to ignore extremely brief traffic surges and instead respond only to sustained increases in load. This change prevents unnecessary provisioned concurrency scaling during momentary bursts and reduces serving costs without impacting performance or reliability.

Data Classification (Public Preview)

October 13, 2025

Databricks Data Classification is now in Public Preview. It supports all catalog types, consolidates all classification results into a single system table, and adds a new UI for reviewing and auto-tagging classifications.

Context-based ingress control (Beta)

October 9, 2025

Context-based ingress control is now in Beta. This enables account admins to set allow and deny rules that combine who is calling, from where they are calling, and what they can reach in Azure Databricks. Context-based ingress control ensures that only trusted combinations of identity, request type, and network source can reach your workspace. A single policy can govern multiple workspaces, ensuring consistent enforcement across your organization.

The billable usage table now records the performance mode of serverless jobs and pipelines

October 9, 2025

Billing logs now record the performance mode of serverless jobs and pipelines. The workload's performance mode is logged in the product_features.performance_target column and can include values of PERFORMANCE_OPTIMIZED, STANDARD, or null.
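
As an illustrative query (the aggregation shape is an assumption; the column path is from this note):

    # Summarize the last 30 days of serverless usage by performance mode.
    spark.sql("""
        SELECT product_features.performance_target AS performance_mode,
               SUM(usage_quantity) AS total_dbus
        FROM system.billing.usage
        WHERE usage_date >= current_date() - INTERVAL 30 DAYS
        GROUP BY 1
    """).show()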

Databricks Runtime maintenance updates (round 1)

October 7, 2025

New maintenance updates are available for supported Databricks Runtime versions. These updates include bug fixes, security patches, and performance improvements. For details, see Databricks Runtime maintenance updates.

Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS ML are in Beta

October 6, 2025

Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS ML are now in Beta, powered by Apache Spark 4.0.0. The release includes new configuration options, improved error handling, and enhanced Spark Connect support.

See Databricks Runtime 17.3 LTS and Databricks Runtime 17.3 LTS for Machine Learning.

Partition metadata is generally available

October 6, 2025

You can now enable partition metadata logging, a partition discovery strategy for external tables registered to Unity Catalog. See Use partition metadata logging.

Delta Sharing recipients can apply row filters and column masks (GA)

October 6, 2025

Delta Sharing recipients can now apply their own row filters and column masks on shared tables and shared foreign tables. However, Delta Sharing providers still cannot share data assets that have row-level security or column masks.

For details, see Apply row filters and column masks.

Certification status system tag is in Public Preview

October 6, 2025

You can now apply the system.certification_status governed tag to catalogs, schemas, tables, views, volumes, dashboards, registered models, and Genie Spaces to indicate whether a data asset is certified or deprecated. This improves governance, discoverability, and trust in analytics and AI workloads.
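
For example (the catalog, table, and tag value are placeholders; the tag key is from this note):

    # Mark a table as certified using the governed system tag.
    spark.sql("""
        ALTER TABLE main.analytics.kpis
        SET TAGS ('system.certification_status' = 'certified')
    """)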

Prompt caching is now supported for Claude models

October 3, 2025

Prompt caching is now supported for Databricks-hosted Claude models. You can specify the cache_control parameter in your query requests to cache the following (see the sketch after this list):

  • Thinking message content in the messages.content array.
  • Image content blocks in the messages.content array.
  • Tool use, results, and definitions in the tools array.
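
A hedged sketch using the OpenAI-compatible client against a Databricks serving endpoint; the token, workspace URL, and model name are placeholders, and the cache_control block follows the Anthropic-style format this note describes:

    from openai import OpenAI

    client = OpenAI(
        api_key="<databricks-token>",                      # placeholder token
        base_url="https://<workspace-url>/serving-endpoints",
    )
    response = client.chat.completions.create(
        model="databricks-claude-sonnet-4",                # placeholder model name
        messages=[
            {
                "role": "system",
                "content": [
                    {
                        "type": "text",
                        "text": "<long, reusable system instructions>",
                        "cache_control": {"type": "ephemeral"},  # cache this block
                    }
                ],
            },
            {"role": "user", "content": "Summarize our Q3 results."},
        ],
    )
    print(response.choices[0].message.content)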

Notebook improvements

October 3, 2025

The following notebook improvements are now available:

  • The cell execution minimap now appears in the right margin of notebooks. Use the minimap to get a visual overview of your notebook's run status and quickly navigate between cells. See Cell execution minimap.

  • Use Databricks Assistant to help diagnose and fix environment errors, including library installation errors.

  • When reconnecting to serverless notebooks, sessions are automatically restored with the notebook's Python variables and Spark state. See Automated session restoration for serverless notebooks.

  • PySpark authoring completion now supports agg, withColumns, withColumnsRenamed, and filter/where clauses.

  • Databricks now supports importing and exporting IPYNB notebooks up to 100 MB. Revision snapshot autosaving, manual saving, and cloning are supported for all notebooks up to 100 MB. See Notebook sizing.

  • When cloning and exporting notebooks, you can now choose whether to include cell outputs or not. See Manage notebook format.

Convert to Unity Catalog managed table from external table

October 2, 2025

The ALTER TABLE ... SET MANAGED command is now generally available. This command seamlessly converts Unity Catalog external tables to managed tables. It allows you to take full advantage of Unity Catalog managed table features, such as enhanced governance, reliability, and performance. See Convert an external table to a managed Unity Catalog table.
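
For example (the table name is hypothetical):

    # Convert a Unity Catalog external table to a managed table in place.
    spark.sql("ALTER TABLE main.sales.orders_ext SET MANAGED")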

Git email identity configuration for Git folders

October 1, 2025

You can now specify a Git provider email address, separate from your username, when creating Git credentials for Databricks Git folders. This email is used as the Git author and committer identity for all commits made through Git folders, ensuring proper attribution in your Git provider and better integration with your Git account.

The email you provide becomes the GIT_AUTHOR_EMAIL and GIT_COMMITTER_EMAIL for commits, allowing Git providers to properly associate commits with your user account and display your profile information. If no email is specified, Databricks uses your Git username as the email address (legacy behavior).

See Git commit identity and email configuration.

Configure Azure virtual network service policies for storage access (Public Preview)

October 1, 2025

Use Azure virtual network service endpoint policies to filter outbound traffic from the classic compute plane, ensuring connections are only made to specific Azure Storage accounts. See Configure Azure virtual network service endpoint policies for storage access from classic compute.

New permissions for the Databricks GitHub App

October 1, 2025

If you own an Azure Databricks account with the Azure Databricks GitHub app installed, you may receive an email titled "Databricks is requesting updated permissions" from GitHub.

This is a legitimate request from Databricks. It asks you to approve a new permission that allows Azure Databricks to read your GitHub account email(s). Granting this permission will let Azure Databricks retrieve and save your primary GitHub account email to your Linked Git credential in Azure Databricks. In an upcoming feature, this will ensure that commits made from Azure Databricks are properly linked to your GitHub identity.

If you don't accept the new permission, your Linked Git credential will still authenticate with GitHub. However, future commits from this credential will not be associated with your GitHub account identity.