Azure Databricks regions

This article lists:

  • The regions supported by Azure Databricks.

  • Features available in each region, where there is regional differentiation in feature availability.

  • IP addresses and domains for Azure Databricks services and assets.

    You may need this information if your Azure Databricks workspace is deployed to your own virtual network (VNet) and you use custom routes, also known as user-defined routes (UDR), to manage network traffic using a virtual appliance or firewall.

Supported regions list

The following table lists the Azure regions supported by Azure Databricks. Some features are available only in a subset of regions; for each of those features, the table indicates whether a region supports it. Features that are available in all regions are not included in the table.

Note

Until recently, this table listed region support for Unity Catalog and Databricks SQL. Now that these features are supported in all listed regions, the table no longer mentions them.

| Region | Location | Serverless SQL warehouses | Model Serving | Vector search | Predictive optimization |
| --- | --- | --- | --- | --- | --- |
| chinaeast2 | China East 2 | | | | |
| chinaeast3 | China East 3 | | | | |
| chinanorth2 | China North 2 | | | | |
| chinanorth3 | China North 3 | | | | |

IP addresses and domains

If you manage network traffic using a virtual appliance or firewall, you might need the following information to ensure that network traffic is routed correctly for your workspace.

See User-defined route settings for Azure Databricks.

Databricks strongly recommends that you use the Azure Databricks service tag instead of specific IP addresses. Azure service tags represent a group of IP address prefixes from a given Azure service. The Azure Databricks service tag represents IP addresses for the required outbound connections to the Azure Databricks control plane, the secure cluster connectivity (SCC) relay, and the Azure Databricks web application. Azure Databricks manages the address prefixes encompassed by the service tag and automatically updates the service tag as addresses change. This helps to prevent service outages due to IP changes and removes the need to periodically look up these IPs and update them.
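If you want to inspect the address prefixes behind the service tag yourself, the Azure Service Tag Discovery API can list them. The following is a minimal sketch using the azure-mgmt-network Python SDK; the subscription ID and region are placeholders, and the Azure China Resource Manager endpoint configuration is assumed rather than shown.

```python
# Sketch: list the IP prefixes behind the AzureDatabricks service tag.
# Assumes the azure-identity and azure-mgmt-network packages are installed
# and that the credential has read access to the subscription.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
REGION = "chinaeast2"                  # placeholder region

credential = DefaultAzureCredential()
# For Azure China, the client must target the China Resource Manager
# endpoint (https://management.chinacloudapi.cn); that configuration is
# omitted from this sketch.
client = NetworkManagementClient(credential, SUBSCRIPTION_ID)

tags = client.service_tags.list(REGION)
for tag in tags.values:
    # Regional variants appear as "AzureDatabricks.<Region>".
    if tag.name == "AzureDatabricks" or tag.name.startswith("AzureDatabricks."):
        print(tag.name, tag.properties.address_prefixes)
```

Using the service tag name itself in network security group rules or user-defined routes is still the recommended approach; listing the prefixes is mainly useful for auditing or for appliances that cannot consume service tags directly.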

Azure Databricks control plane addresses

The IP addresses you use to route network traffic depend on whether your Azure Databricks workspace uses secure cluster connectivity (SCC).

Most regions have multiple IP address ranges for the control plane IPs, webapp, and NAT, because those regions contain more infrastructure services than others. During workspace creation, your workspace is assigned to infrastructure services at one IP address for the control plane NAT and one for the webapp. Your workspace is not accessible through the infrastructure services at the other IP addresses, because data and secrets are not shared between infrastructure services within a region. There is therefore no security issue with specifying multiple IP addresses in your network security groups.

Inbound to Azure Databricks control plane

| Azure Databricks Region | Service | Public IP or domain name |
| --- | --- | --- |
| China East 2 | Control Plane IPs, including webapp | 52.130.1.64/32 |
| China East 2 | SCC relay | tunnel.chinaeast2.databricks.azure.cn |
| China East 3 | Control Plane IPs, including webapp | 52.130.1.64/32 |
| China East 3 | SCC relay | tunnel.chinaeast2.databricks.azure.cn |
| China North 2 | Control Plane IPs, including webapp | 52.130.16.113/32 |
| China North 2 | SCC relay | tunnel.chinanorth2.databricks.azure.cn |
| China North 3 | Control Plane IPs, including webapp | 52.130.16.113/32 |
| China North 3 | SCC relay | tunnel.chinanorth2.databricks.azure.cn |

Outbound from Azure Databricks control plane

These values are used only if secure cluster connectivity is disabled.

| Azure Databricks Region | Service | Public IP or domain name |
| --- | --- | --- |
| China East 2 | Control Plane NAT | 52.130.1.65/32 |
| China East 3 | Control Plane NAT | 52.130.1.65/32 |
| China North 2 | Control Plane NAT | 52.130.16.112/32 |
| China North 3 | Control Plane NAT | 52.130.16.112/32 |

DBFS root storage IP address

To get IP addresses for DBFS root storage:

  1. Go to the workspace instance in Azure portal.
  2. Click the workspace's managed resource group name.
  3. In the list of resources, find the storage account whose name has the format dbstorage************ and copy its name.
  4. Get the endpoint domains, using the storage account name that you copied:
    • Domain <storage-account-name>.blob.core.chinacloudapi.cn. For example, dbstorage9875b57ac95c.blob.core.chinacloudapi.cn.
    • Domain <storage-account-name>.dfs.core.chinacloudapi.cn. For example, dbstorage9875b57ac95c.dfs.core.chinacloudapi.cn.
  5. Look up the IP addresses for these domain names (see the sketch after these steps).
  6. Create two UDRs to these IP addresses so that the UDRs route the traffic to the Azure Storage service.
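To help with steps 4 and 5, the endpoint domains can also be resolved programmatically. The following is a minimal sketch using only the Python standard library; the storage account name is the example name from step 4 and should be replaced with the one copied from your managed resource group.

```python
# Sketch: resolve the DBFS root storage endpoints to IP addresses.
# Replace the placeholder below with the dbstorage************ name
# copied from the workspace's managed resource group.
import socket

storage_account = "dbstorage9875b57ac95c"  # placeholder from the example above
domains = [
    f"{storage_account}.blob.core.chinacloudapi.cn",
    f"{storage_account}.dfs.core.chinacloudapi.cn",
]

for domain in domains:
    # getaddrinfo returns every address currently published for the name.
    ips = sorted({info[4][0] for info in socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)})
    print(domain, ips)
```

The resolved addresses are the ones that step 6 routes to the Azure Storage service. Keep in mind that they can change over time, as noted in the warning later in this article.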

Metastore, artifact Blob storage, system tables storage, log Blob storage, and Event Hub endpoint IP addresses

To get the IP addresses for the workspace-level Hive metastore, artifact Blob storage, system tables storage, log Blob storage, and Event Hub endpoints, look up the IP addresses for the domain names provided in the following table.

Warning

Hive metastore, artifact Blob storage, log Blob storage, DBFS root Blob storage, and Event Hub endpoint IPs can change over time. To prevent a service outage due to IP changes, we suggest that you use Azure service tags in your route table. You can also establish a periodic job to look up these IPs automatically and keep them up to date in your route table.

Because metastore IP addresses can change over time, the primary and secondary metastores (in regions that have secondary metastores) are sometimes assigned the same IP address. In that case, include only one of the metastores in your route table.

Note

When using an external Hive metastore, make sure that there are no existing DNS records for mysql.database.chinacloudapi.cn in any of the domain controllers or Private DNS Zones connected to the VNet associated with Azure Databricks. If such DNS records exist, an additional subdomain is required.

| Azure Databricks Workspace Region | Service | FQDN | Port | Protocol |
| --- | --- | --- | --- | --- |
| China East 2 | Metastore | consolidated-chinaeast2-prod-metastore-0.mysql.database.chinacloudapi.cn | 3306 | TCP |
| China East 2 | Artifact Blob storage primary | dbartifactsprodcne2.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China East 2 | Artifact Blob storage secondary | dbartifactsprodcnn2.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China East 2 | Log Blob storage | dblogprodchinaeast2.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China East 2 | Event Hub endpoint | prod-chinaeast2-observabilityeventhubs.servicebus.chinacloudapi.cn | 9093 | TCP |
| China East 3 | Metastore | consolidated-chinaeast3-prod-metastore-0.mysql.database.chinacloudapi.cn | 3306 | TCP |
| China East 3 | Artifact Blob storage primary | dbartifactsprodcne3.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China East 3 | Artifact Blob storage secondary | dbartifactsprodcne3.blob.core.chinacloudapi.cn (identical to primary) | 443 | HTTPS |
| China East 3 | Log Blob storage | dblogprodchinaeast3.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China East 3 | Event Hub endpoint | prod-chinaeast2-observabilityeventhubs.servicebus.chinacloudapi.cn | 9093 | TCP |
| China North 2 | Metastore | consolidated-chinanorth2-prod-metastore-0.mysql.database.chinacloudapi.cn | 3306 | TCP |
| China North 2 | Artifact Blob storage primary | dbartifactsprodcnn2.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China North 2 | Artifact Blob storage secondary | dbartifactsprodcnn2.blob.core.chinacloudapi.cn (identical to primary) | 443 | HTTPS |
| China North 2 | Log Blob storage | dblogprodchinanorth2.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China North 2 | Event Hub endpoint | prod-chinanorth2-observabilityeventhubs.servicebus.chinacloudapi.cn | 9093 | TCP |
| China North 3 | Metastore | consolidated-chinanorth3-prod-metastore-0.mysql.database.chinacloudapi.cn | 3306 | TCP |
| China North 3 | Artifact Blob storage primary | dbartifactsprodcnn3.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China North 3 | Artifact Blob storage secondary | dbartifactsprodcnn3.blob.core.chinacloudapi.cn (identical to primary) | 443 | HTTPS |
| China North 3 | Log Blob storage | dblogprodchinanorth3.blob.core.chinacloudapi.cn | 443 | HTTPS |
| China North 3 | Event Hub endpoint | prod-chinanorth2-observabilityeventhubs.servicebus.chinacloudapi.cn | 9093 | TCP |
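As suggested in the warning above, a periodic job can resolve these FQDNs and flag stale route-table entries. The following is a minimal sketch using only the Python standard library; the endpoint list is the China East 2 subset from the table above, and updating the route table itself (for example, through the Azure SDK or CLI) is deliberately left out.

```python
# Sketch: periodically resolve endpoint FQDNs from the table above and
# report their current IP addresses so stale route-table entries can be
# spotted. Updating the route table itself is out of scope here.
import socket

# Example FQDNs for China East 2, taken from the table above.
ENDPOINTS = [
    "consolidated-chinaeast2-prod-metastore-0.mysql.database.chinacloudapi.cn",
    "dbartifactsprodcne2.blob.core.chinacloudapi.cn",
    "dblogprodchinaeast2.blob.core.chinacloudapi.cn",
    "prod-chinaeast2-observabilityeventhubs.servicebus.chinacloudapi.cn",
]

def current_ips(fqdn: str) -> set[str]:
    """Return the set of addresses the name currently resolves to."""
    return {info[4][0] for info in socket.getaddrinfo(fqdn, None)}

if __name__ == "__main__":
    for fqdn in ENDPOINTS:
        print(fqdn, sorted(current_ips(fqdn)))
```

Run on a schedule (for example, daily), the output can be compared against the prefixes already in your route table, and any difference can trigger an update.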