If I have an on-premise installation of Dynamics 365 Business Central, can I move my database from a local SQL Server to Azure SQL by automating this task (no manual processing)?
I received this question in a forum a few days ago, and I remembered that I had already created a script for exactly this task. Today I found it in one of my repos, so it's time to share it 🙂
Moving an on-premise Dynamics 365 Business Central database from SQL Server to Azure SQL can be done in the following ways (make sure the database does not contain users with Windows credentials: only users with SQL authentication are allowed; see the quick check after the three options below).
Way 1:
- Create an Azure SQL database via the Azure portal and configure it (firewall rules in order to be accessible by your local machine)
- Open SQL Server Management Studio on your local machine, connect to your local SQL Server instance, right-click your D365BC database and select Tasks|Export Data-tier Application.
- A wizard starts and from here you can save a .bacpac file
- Connect to your Azure SQL instance (directly from SSMS or via Azure Portal) and import the .bacpac file (from SSMS right click Databases and select Import Data-tier Application). You can also transfer the .bacpac file to an Azure Storage account and then import from here.
Way 2:
- Create an Azure SQL database via the Azure portal and configure it (firewall rules in order to be accessible by your local machine)
- Open SQL Server Management Studio on your local machine, connect to your local SQL Server instance, right-click your D365BC database and select Tasks|Deploy Database to Microsoft Azure SQL Database.
- The wizard prompts you to connect to your Azure SQL instance and deploys the database directly
Way 3:
- Create an Azure SQL database via the Azure portal and configure it (firewall rules in order to be accessible by your local machine)
- Open SQL Server Management Studio on your local machine, connect to your local SQL Server instance, right-click your D365BC database and select Tasks|Export Data-tier Application.
- A wizard starts and from here you can save a .bacpac file
- On your SQL Server machine, go to C:\Program Files\Microsoft SQL Server\<YourVersion>\DAC\bin and here you should have a program called sqlpackage.exe.
- Run sqlpackage.exe /Action:Import /tsn:YOURSERVERNAME /tdn:YOURDATABASENAME /tu:YOURUSER /tp:YOURPASSWORD /sf:YOURBACPACFILE, where:
- /Action indicates whether we are going to import, export, publish, or extract data.
- /tsn is used to define the Target Server Name.
- /tdn is the name of the new database to create on Azure SQL
- /tu is the Target User (Azure SQL admin user)
- /tp is the Target Password (Azure SQL admin password)
- /sf is the source file (path of your .bacpac file)
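As a quick pre-check for the Windows-credentials requirement mentioned at the beginning, a query like the following lists any Windows-authenticated principals that would block the .bacpac export. This is only a sketch: it assumes the SqlServer PowerShell module is installed, and the server and database names are placeholders to adapt to your environment.

#Sketch: list Windows-authenticated principals in the local D365BC database
#(server and database names below are placeholders; adjust to your environment)
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Demo Database BC (14-0)" -Query @"
SELECT name, type_desc
FROM sys.database_principals
WHERE type_desc IN ('WINDOWS_USER', 'WINDOWS_GROUP')
"@

If this query returns any rows, remove those users or convert them to SQL authentication before exporting the .bacpac.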
But what if we want to automate all these tasks? PowerShell is our friend 🙂
For this script, we have to install and use the Azure PowerShell (AzureRM) module.
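If the module is not already on your machine, it can be installed from the PowerShell Gallery. This is a minimal sketch, assuming the AzureRM generation of the module used by this script (plus the classic Azure module for the storage cmdlets used later); exact requirements may vary on your machine.

#Install the Azure PowerShell modules used by the script
#(run in an elevated prompt, or add -Scope CurrentUser)
Install-Module -Name AzureRM -AllowClobber
Install-Module -Name Azure -AllowClobber
Import-Module AzureRM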
The script starts by declaring some variables (names are self-explanatory):
$bacpacfilepath = "C:\Users\stefano\OneDrive\D365BC\BACPAC\d365bconprem190207.bacpac"
$bacpacname = "d365bconprem190207.bacpac"
$resourcegroup = "d365bc190207rg"
$location = "West Europe"
$servername = "d365bc190207srv"
$serverversion = "12.0"
$databasename = "d365bc190207"
$firewallrulename = "d365bc190207fwrule"
$localIP = 'XXX.XXX.XXX.XXX' #Your local IP to trust on Azure SQL
$subscription = "Microsoft Partner Network"
$storageaccountname = "d365bc190207st"
$storagecontainer = "dbcontainer"
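If you don't already have a .bacpac file at $bacpacfilepath, one way to produce it without the SSMS wizard is the sqlpackage.exe export action mentioned in Way 3. This is only a sketch: the local server name, database name and SQL Server version folder are assumptions to adapt to your environment.

#Sketch: export the local D365BC database to a .bacpac file
#(adjust the sqlpackage.exe path, server name and database name to your environment)
& "C:\Program Files\Microsoft SQL Server\140\DAC\bin\sqlpackage.exe" `
    /Action:Export `
    "/ssn:localhost" `
    "/sdn:Demo Database BC (14-0)" `
    "/tf:$bacpacfilepath"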
Then we login to our Azure account:
#Login to Azure account
Add-AzureRmAccount -Subscription $subscription
We then create a resource group that will contain all the resources needed for this new database:
#Create a resource group
New-AzureRmResourceGroup -Name $resourcegroup -Location $location
Then we create an Azure SQL Database server instance (choosing the admin credentials) and set a firewall rule on this server so that our local machine can access the Azure SQL instance:
#Create Azure SQL database server
#Stores the Azure SQL credentials (specify your Azure SQL Server login and password)
$credential = Get-Credential
New-AzureRmSqlServer -ResourceGroupName $resourcegroup -Location $location -ServerName $servername -ServerVersion $serverversion -SqlAdministratorCredentials $credential

#Firewall rules settings
New-AzureRmSqlServerFirewallRule -ResourceGroupName $resourcegroup -ServerName $servername -FirewallRuleName $firewallrulename -StartIpAddress $localIP -EndIpAddress $localIP
We then create a Storage Account for uploading our .bacpac file:
#Create Storage Account
Select-AzureSubscription -SubscriptionName $subscription
New-AzureStorageAccount -StorageAccountName $storageaccountname -Location $location
We create the container inside the Storage Account with the right permissions (here I want only the owner of the container to be able to access it):
#Create a container in the Storage Account. Permission Off means that only the owner of the container has access to it.
Set-AzureSubscription -CurrentStorageAccountName $storageaccountname -SubscriptionName $subscription
New-AzureStorageContainer -Name $storagecontainer -Permission Off
Then we upload the .bacpac file to the Azure Storage Account container:
#Upload the bacpac file to the container
Set-AzureStorageBlobContent -Container $storagecontainer -File $bacpacfilepath
The upload starts and, when it finishes, you can see the uploaded .bacpac blob in the container.
After the .bacpac upload to the storage container, we can create the Azure SQL database from this .bacpac file (this requires a few steps, explained in the code comments):
#Import the bacpac to create a Database in Azure SQL Server:

#1) Retrieves the Azure Storage Key
$primarykey = (Get-AzureStorageKey -StorageAccountName $storageaccountname).Primary

#2) Retrieves the URI of the blob file
$StorageUri = (Get-AzureStorageBlob -Blob $bacpacname -Container $storagecontainer).ICloudBlob.uri.AbsoluteUri

#3) Import the bacpac file on Azure SQL (we connect using a StorageAccessKey)
$importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourcegroup -ServerName $servername -DatabaseName $databasename -StorageKeytype "StorageAccessKey" -StorageKey $primarykey -StorageUri $StorageUri -AdministratorLogin $credential.UserName -AdministratorLoginPassword $credential.Password -Edition Standard -ServiceObjectiveName S0 -DatabaseMaxSizeBytes 500000
The import now starts. We periodically check the import status:
#Check the import status
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
[Console]::Write("Importing")
while ($importStatus.Status -eq "InProgress")
{
    $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
$importStatus
The script continuously polls the database creation status and, when the database creation is finished, it ends and prints the final import status.
That's all. If you check your Azure Portal now, you have your Azure SQL server in place and your Dynamics 365 Business Central database ready to go, all by launching a single script 🙂
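If you prefer to verify the result from PowerShell rather than the portal, a quick check like this (reusing the variables declared at the top of the script) should return the newly created database:

#Optional: verify that the new database exists on the Azure SQL server
Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroup -ServerName $servername -DatabaseName $databasename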
The complete script can be downloaded from here.
Thanks for the great article.
If I export data to Azure SQL with "Way 1", is this a one-time upload of data, or does the Azure SQL database refresh with the NAV database?
No, these are one-way methods (a one-time export of the database).