Quickstart: Import a bacpac file to a database in Azure SQL Database or Azure SQL Managed Instance

Applies to: Azure SQL Database, Azure SQL Managed Instance

You can import a SQL Server database into Azure SQL Database or SQL Managed Instance using a .bacpac file. You can import the data from a .bacpac file stored in Azure Blob storage (standard storage only) or from local storage in an on-premises location. To maximize import speed by providing more and faster resources, scale your database to a higher service tier and compute size during the import process. You can then scale down after the import is successful.
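
For example, if you use the Az.Sql PowerShell module, you can scale the database back down once the import succeeds. This is a minimal sketch with placeholder names; the Standard S2 target is only an illustration, so substitute the tier and compute size that fit your workload.

# Scale the imported database down to a smaller compute size after the import completes (placeholder values).
Set-AzSqlDatabase -ResourceGroupName "<resourceGroupName>" `
    -ServerName "<serverName>" `
    -DatabaseName "<databaseName>" `
    -Edition "Standard" `
    -RequestedServiceObjectiveName "S2"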

Use the Azure portal

The Azure portal only supports creating a single database in Azure SQL Database and only from a .bacpac file stored in Azure Blob storage.

Warning

Bacpac files over 4 GB generated by SqlPackage may fail to import from the Azure portal or Azure PowerShell with an error message that states File contains corrupted data. This is a result of a known issue, and the workaround is to use the SqlPackage command-line utility to import the bacpac file. For more information, see SqlPackage and the issue log.

To migrate a database into Azure SQL Managed Instance from a bacpac file, use SQL Server Management Studio or SqlPackage; using the Azure portal or Azure PowerShell is not currently supported.

Note

Machines processing import/export requests submitted through the Azure portal or PowerShell need to store the bacpac file as well as temporary files generated by the Data-Tier Application Framework (DacFX). The disk space required varies significantly among databases of the same size and can be up to three times the size of the database. Machines running the import/export request have only 450 GB of local disk space. As a result, some requests might fail with the error There is not enough space on the disk. In this case, the workaround is to run SqlPackage on a machine with enough local disk space. We encourage using SqlPackage to import and export databases larger than 150 GB to avoid this issue.

  1. To import from a bacpac file into a new single database using the Azure portal, open the appropriate server page and then, on the toolbar, select Import database.

    Screenshot of the Azure portal, logical server overview page, with database import selected.

  2. Select Select backup. Choose the storage account and container that hold your bacpac file, and then select the bacpac file from which to import.

  3. Specify the new database size (usually the same as the source database) and provide the destination server credentials. For a list of possible values for a new database in Azure SQL Database, see Create Database.

    Screenshot of the Azure portal, Database import page.

  4. Select OK.

  5. To monitor an import's progress, open the database's server page, and, under Settings, select Import/Export history. When successful, the import has a Completed status.

    Screenshot of the Azure portal, server overview page, showing the database import status.

  6. To verify the database is live on the server, select SQL databases and verify the new database is Online. You can also check this from PowerShell, as shown in the sketch below.
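
As an alternative to checking in the portal, the following minimal sketch uses the Az.Sql module to confirm that the imported database reports an Online status; the resource group, server, and database names are placeholders.

# Confirm that the imported database is Online (placeholder names).
Get-AzSqlDatabase -ResourceGroupName "<resourceGroupName>" `
    -ServerName "<serverName>" `
    -DatabaseName "<databaseName>" |
    Select-Object DatabaseName, Status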

Use SqlPackage

To import a SQL Server database using the SqlPackage command-line utility, see import parameters and properties. You can download the latest SqlPackage for Windows, macOS, or Linux.
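
For example, if the .NET SDK is installed, one way to get SqlPackage on any of these platforms is as a dotnet global tool. This is a sketch of the install and a version check; it assumes the machine can reach nuget.org.

# Install SqlPackage as a .NET global tool and confirm the installed version.
dotnet tool install --global microsoft.sqlpackage
SqlPackage /version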

For scale and performance, we recommend using SqlPackage in most production environments rather than using the Azure portal. For a SQL Server Customer Advisory Team blog about migrating using BACPAC files, see migrating from SQL Server to Azure SQL Database using BACPAC Files.

The DTU-based provisioning model supports select database max size values for each tier. When importing a database, use one of these supported values.

The following SqlPackage command imports the AdventureWorks2008R2 database from local storage to a logical SQL server named mynewserver20170403. It creates a new database called myMigratedDatabase with a Premium service tier and a P6 Service Objective. Change these values as appropriate for your environment.

SqlPackage /a:import /tcs:"Data Source=<serverName>.database.chinacloudapi.cn;Initial Catalog=<migratedDatabase>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P6

Important

To connect to Azure SQL Database from behind a corporate firewall, the firewall must have port 1433 open. To connect to SQL Managed Instance, you must have a point-to-site connection or an ExpressRoute connection.
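
A quick way to confirm that outbound port 1433 is open is to test the connection from Windows PowerShell before running the import. This is a minimal sketch; the server name is a placeholder.

# Verify that the logical server is reachable on port 1433 from this machine (placeholder server name).
Test-NetConnection -ComputerName "<serverName>.database.chinacloudapi.cn" -Port 1433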

As an alternative to a username and password, you can use Microsoft Entra ID (formerly Azure Active Directory). Currently, the Import/Export service does not support Microsoft Entra authentication when MFA is required. Replace the username and password parameters with /ua:true and /tid:"yourdomain.partner.onmschina.cn". This example shows how to import a database using SqlPackage with Microsoft Entra authentication:

SqlPackage /a:Import /sf:testExport.bacpac /tdn:NewDacFX /tsn:apptestserver.database.chinacloudapi.cn /ua:True /tid:"apptest.partner.onmschina.cn"

Azure Data Studio

Azure Data Studio is a free, open-source tool and is available for Windows, macOS, and Linux. The "SQL Server dacpac" extension provides a wizard interface to SqlPackage operations including export and import. For more information on installing and using the extension, see the SQL Server dacpac extension documentation.

Use PowerShell

Note

Azure SQL Managed Instance does not currently support migrating a database into an instance database from a bacpac file by using Azure PowerShell. To import into a SQL managed instance, use SQL Server Management Studio or SqlPackage, as shown in the sketch below.
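
For a managed instance, the SqlPackage command has the same shape as the single-database example earlier; only the target connection string changes, and the DatabaseEdition and DatabaseServiceObjective properties are omitted. This is a sketch with placeholder values; the <dnsZone> segment of the host name is an assumption about your instance's generated DNS zone, and the machine running the command needs connectivity to the instance as described above.

SqlPackage /a:Import /tcs:"Data Source=<managedInstanceName>.<dnsZone>.database.chinacloudapi.cn;Initial Catalog=<databaseName>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac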

Note

The machines processing import/export requests submitted through the portal or PowerShell need to store the bacpac file as well as temporary files generated by the Data-Tier Application Framework (DacFX). The disk space required varies significantly among databases of the same size and can be up to three times the size of the database. Machines running the import/export request have only 450 GB of local disk space. As a result, some requests might fail with the error There is not enough space on the disk. In this case, the workaround is to run SqlPackage on a machine with enough local disk space. When importing or exporting databases larger than 150 GB, use SqlPackage to avoid this issue.

Important

The PowerShell Azure Resource Manager (AzureRM) module is still supported, but all future development is for the Az.Sql module. The AzureRM module will continue to receive bug fixes until at least December 2020. The arguments for the commands in the Az module and in the AzureRM modules are substantially identical. For more about their compatibility, see Introducing the new Azure PowerShell Az module.

Use the New-AzSqlDatabaseImport cmdlet to submit an import database request to Azure. Depending on database size, the import might take some time to complete. The DTU-based provisioning model supports select database max size values for each tier. When importing a database, use one of these supported values.

$importRequest = New-AzSqlDatabaseImport -ResourceGroupName "<resourceGroupName>" `
    -ServerName "<serverName>" -DatabaseName "<databaseName>" `
    -DatabaseMaxSizeBytes "<databaseSizeInBytes>" -StorageKeyType "StorageAccessKey" `
    -StorageKey $(Get-AzStorageAccountKey `
        -ResourceGroupName "<resourceGroupName>" -StorageAccountName "<storageAccountName>").Value[0] `
    -StorageUri "https://myStorageAccount.blob.core.chinacloudapi.cn/importsample/sample.bacpac" `
    -Edition "Premium" -ServiceObjectiveName "P6" `
    -AdministratorLogin "<userId>" `
    -AdministratorLoginPassword $(ConvertTo-SecureString -String "<password>" -AsPlainText -Force)

You can use the Get-AzSqlDatabaseImportExportStatus cmdlet to check the import's progress. Running the cmdlet immediately after the request usually returns Status: InProgress. The import is complete when you see Status: Succeeded.

$importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink

[Console]::Write("Importing")
while ($importStatus.Status -eq "InProgress") {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}

[Console]::WriteLine("")
$importStatus

Tip

For another script example, see Import a database from a BACPAC file.

Cancel the import request

Use the Database Operations - Cancel API or the Stop-AzSqlDatabaseActivity PowerShell command, as in the following example:

Stop-AzSqlDatabaseActivity -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -OperationId $Operation.OperationId
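
If you don't already have the operation object, you can look up the in-progress operation and its OperationId first. This is a minimal sketch using Get-AzSqlDatabaseActivity; it assumes the activity's State property reports InProgress for a running import, and the variable names match the example above.

# Find the in-progress operation for the database, then cancel it.
$Operation = Get-AzSqlDatabaseActivity -ResourceGroupName $ResourceGroupName `
    -ServerName $ServerName -DatabaseName $DatabaseName |
    Where-Object { $_.State -eq "InProgress" }

Stop-AzSqlDatabaseActivity -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -OperationId $Operation.OperationId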

Permissions required to cancel import

To cancel the import operation, you need to be a member of one of the following roles:

Compatibility level of the new database

  • The imported database's compatibility level is based on the source database's compatibility level.
  • After importing your database, you can choose to operate the database at its current compatibility level or at a higher level. For more information on the implications and options for operating a database at a specific compatibility level, see ALTER DATABASE Compatibility Level. See also ALTER DATABASE SCOPED CONFIGURATION for information about other database-level settings related to compatibility levels. A sketch of changing the level after import follows this list.
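
For example, you can raise the compatibility level after the import with a T-SQL statement. The sketch below runs it through Invoke-Sqlcmd from the SqlServer PowerShell module, which is an assumption about your tooling; the server, database, credentials, and the target level 150 are placeholders.

# Raise the imported database's compatibility level (placeholder values; requires the SqlServer PowerShell module).
Invoke-Sqlcmd -ServerInstance "<serverName>.database.chinacloudapi.cn" `
    -Database "<databaseName>" -Username "<userId>" -Password "<password>" `
    -Query "ALTER DATABASE CURRENT SET COMPATIBILITY_LEVEL = 150;"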

Limitations

  • Importing to a database in an elastic pool isn't supported. You can import data into a single database and then move the database to an elastic pool.
  • The Import/Export service does not work when Allow access to Azure services is set to OFF. However, you can work around the problem by manually running SqlPackage from an Azure VM or by performing the export directly in your code with the DacFx API.
  • Import does not support specifying a backup storage redundancy while creating a new database; it creates the database with the default geo-redundant backup storage redundancy. To work around this, first create an empty database with the desired backup storage redundancy by using the Azure portal or PowerShell, and then import the bacpac into this empty database (see the sketch after this list).
  • Storage behind a firewall is currently not supported.
  • During the import process, do not create a database with the same name. The import process creates a new database of the specified name.
  • Currently, the Import/Export service does not support Microsoft Entra ID authentication when MFA is required.
  • Import/Export services only support SQL authentication and Microsoft Entra ID. Import/Export is not compatible with Microsoft Identity application registration.
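
The following sketch illustrates the backup storage redundancy workaround mentioned in the list above: create an empty database with the redundancy you want, then import the bacpac into it with SqlPackage. The -BackupStorageRedundancy parameter and the Local value are assumptions about your Az.Sql version and requirements; all other names are placeholders.

# Create an empty database with locally redundant backup storage, then import the bacpac into it (placeholder values).
New-AzSqlDatabase -ResourceGroupName "<resourceGroupName>" -ServerName "<serverName>" `
    -DatabaseName "<databaseName>" -Edition "Premium" -RequestedServiceObjectiveName "P6" `
    -BackupStorageRedundancy "Local"

SqlPackage /a:Import /tcs:"Data Source=<serverName>.database.chinacloudapi.cn;Initial Catalog=<databaseName>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac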

Additional tools

You can also use these wizards.