Backup and restore an Azure SQL Database with BACPAC files

Recently I had to create a backup procedure for my Windows Azure SQL Database. SQL Server Management Studio gives us access to our SQL Databases and provides the user interface for creating backup files. In this post we will see how to create a database copy, how to export it to a BACPAC file, and how to restore it. Since the database copy can be considered a full backup, we also want to create a file that we can store somewhere else, on local or cloud storage. First of all, we need to get familiar with the DACPAC and BACPAC file concepts (aka Data-tier Applications). Long story short, a DACPAC is a portable package that contains data-tier objects (SQL objects, credentials included), while a BACPAC is a related file that contains both the database schema and the data. We have to create a BACPAC file in order to accomplish our goal: backing up the Azure SQL Database. Then we need a connection to our SQL Databases in SSMS, as described in this section of the Windows Azure documentation.

1) Create a database copy

T-SQL will help us create the database copy:
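A minimal sketch of the copy statement (the source database name DBSource is a placeholder; DBSource_Copy is the copy name used later in this post):

```sql
-- Run against the master database of your Azure SQL server.
-- DBSource is a placeholder for the database you want to back up.
CREATE DATABASE DBSource_Copy AS COPY OF DBSource;
```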
The command above MUST be executed in the master database context; the USE statement is not allowed. If we try to execute it anyway, we’ll receive the following error:

Msg 40508, Level 16, State 1, Line 1
USE statement is not supported to switch between databases. Use a new connection to connect to a different Database.

Why do we need a database copy? Because we want to create a BACPAC file, and we need an isolated copy of the data. We can read the explanation in the Windows Azure documentation: “To make sure that you have an isolated copy of the data which is transactionally consistent, you must first create a database copy, and then create the BACPAC file from the copy. Alternatively, you can also prevent modifications to the data in the databases during export by limiting access to the databases through permissions or connections. Use the following steps to create a backup of the SQL Database.” That’s it: we are following the first suggestion. Taking a database copy is like doing a full backup of an on-premises SQL Server database.
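One detail worth knowing: the copy statement returns almost immediately, while the copy itself continues in the background. A small sketch for monitoring its progress from the master database, using the sys.dm_database_copies view:

```sql
-- Run in the master database: lists in-flight database copies
-- together with their completion percentage.
SELECT database_id, start_date, percent_complete, error_code
FROM sys.dm_database_copies;
```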
But in this post we want to create the BACPAC file in order to store it somewhere else.

2) Create a logical backup file (BACPAC)

As we said before, our BACPAC file will contain the database schema and the data. We can choose between T-SQL and SSMS; let’s do it with the Management Studio UI. Once we’ve created the copy of the Azure SQL Database (we called it DBSource_Copy in the statement above), right-click on it and select Export Data-tier Application… A wizard will appear (second page, after the intro): we can save the BACPAC file locally (useful if you want a backup on your own storage) or directly on the Windows Azure cloud storage. In order to save the file on the cloud storage, you have to specify the credentials (storage account keys) by pressing the Connect button. You can find your keys in the Windows Azure portal by selecting the storage account you want to use and pressing the Manage Keys option in the bottom menu. After a short verification, the connection to the storage will be established.
Now we have to choose the blob container where we want to save the BACPAC file. If you don’t have any container, you can create it directly from the Windows Azure portal by navigating to the cloud storage –> CONTAINERS option. Follow the instructions, setting also the type of storage access (private in our example), and the container will be created in a short time. After this we can refresh the connection and choose the container to work with. The wizard also displays an Advanced tab with which we can include or exclude schemas and objects from our BACPAC file. For a full backup, leave the selection as is (all objects selected). After the summary window, we can execute the process and, after a while, we will receive the following report:
It worked, but in case of error the report will look somewhat different. However, the backup was made; it’s time to test it.

3) Restore the database from the BACPAC file

Like the backup operation, the restore of our Azure SQL Database can be executed directly from SSMS. Right-click on the Databases folder and select Import Data-tier Application… A similar wizard will pop up; one of the differences is the File Name field when connecting to the cloud storage: as you can see, it is possible to browse the cloud storage to get the right BACPAC file. The destination database is set on the next window: the server name is shown in the first text box, and the only thing we have to add is the name of the destination database (a new or an existing one). In addition, we can also change the edition and the max size of the database. If you need further information about the SQL Database editions, read this link.
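If you prefer to script the restore instead of clicking through the wizard, the DAC framework exposes the same import operation programmatically. A minimal sketch (the connection string, file path and target database name are placeholders):

```csharp
using Microsoft.SqlServer.Dac;

class RestoreExample
{
    static void Main()
    {
        // Placeholder connection string: point it at your Azure SQL server.
        var services = new DacServices(
            "Server=tcp:yourserver.database.windows.net;User ID=user;Password=pwd;");

        // Load the BACPAC and import it as a new database.
        using (var package = BacPackage.Load(@"C:\Backups\DBSource.bacpac"))
        {
            services.ImportBacpac(package, "DBRestored");
        }
    }
}
```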
After the restore operation, the database is available and we can use it. If you want to perform these steps with the Azure Management Portal, you can find the Import/Export operations in the SQL Database section. Next to these two operations there is also an Add sync option, with which you can synchronize on-premises and Azure SQL Databases, or a complete Azure SQL Databases environment. If you need further information on this topic, follow this link. Note: as of the July 2013 update to Azure, it’s now possible to schedule backups of Azure SQL databases in the Azure Management Portal. See this announcement post for more details.

Using Azure SQL Database hosted on Windows Azure is a great option in many ways, but there is no built-in support for automating backups of your databases (unlike running SQL Server on an Azure virtual machine, where you have access to the full feature set of SQL Server). This post describes a simple solution we put together in less than two days as an internal tool.
The Windows Azure SQL Database service is a great option for your relational database needs: it gives you most of the features of SQL Server, available as a service, without you having to worry about managing SQL Server itself or the system it’s running on. Additionally, it provides you with a 99.9% monthly SLA, and the data stored in the database is replicated synchronously to three different servers within the same Azure data center, which gives you a high degree of confidence in the durability of the stored data. SQL Server Management Studio 2012 provides built-in support for backing up a database to Azure Blob Storage, which makes this operation very convenient:

Figure 1 Exporting a database from SQL Server Management Studio
Figure 2 Specifying the Azure Storage account and container for the backup

Just specify the Azure Storage account you want to use to store the backup blob, and SQL Server Management Studio takes care of the rest. This is very convenient for a quick, ad-hoc backup, but if you need to do this every night, doing it manually gets cumbersome very quickly (even more so if the backup must be done at an inconvenient time).
As with any other computer-related task that needs to be done more than once, an obvious approach is to automate it so that we know it will be performed the same way every time. For this particular task there are several ways it could be done, but we also wanted to provide a simple UI to let non-developer users define backup jobs and view their results, so a combination of an Azure Web Role (to provide a management website) and an Azure Worker Role (to do the actual backups) fit our needs well. There are several ways to back up SQL Server databases, but in our case the Data-tier Application functionality in SQL Server met our needs. This functionality is available to .NET languages through the Microsoft.SqlServer.Dac assembly, in the form of the Microsoft.SqlServer.Dac.DacServices class. This class makes it very easy to export a database with its schema and data to a BACPAC file. As the sketch below shows, all you need to provide is a connection string to the SQL Server, the path of the BACPAC file the database should be exported to, and the name of the database to export.
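A minimal sketch of the export call (the connection string, file path and database name are placeholders):

```csharp
using System;
using Microsoft.SqlServer.Dac;

class ExportExample
{
    static void Main()
    {
        // Placeholder connection string: point it at your Azure SQL server.
        var services = new DacServices(
            "Server=tcp:yourserver.database.windows.net;User ID=user;Password=pwd;");

        // Optionally capture progress events for display or logging.
        services.ProgressChanged += (s, e) => Console.WriteLine(e.Message);

        // Export the schema and data of the database to a local BACPAC file.
        services.ExportBacpac(@"C:\Backups\DBSource.bacpac", "DBSource_Copy");
    }
}
```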
The DacServices class also provides access to the events that happen during the export, so progress information can be captured and either displayed or logged. This works the same regardless of whether you’re targeting an on-premises SQL Server database, the Azure SQL Database service, or even a SQL Server hosted in an Azure Virtual Machine. It is, however, worth noting that you can only export to a local file; you can’t export to Azure Blob Storage directly. We built a very quick internal tool to wrap all of this with a simple management UI where you can manage backup jobs and view the logs of past jobs:

Figure 3 The overall solution architecture

The management UI runs in a Web Role and accesses Azure Table Storage to get the list of configured backup jobs and the logs associated with those jobs. Once you have the BACPAC file, it’s easy to use the Azure SDK (in this case the 2.0 release) to upload it to Azure Blob Storage. Error checking and logging have been omitted from the snippet below, but these are essentially the steps needed:
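A minimal sketch of the upload, assuming the Azure Storage client library from SDK 2.0 (the account credentials, container name and file paths are placeholders):

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;

class UploadExample
{
    static void Main()
    {
        // Placeholder storage account credentials.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey");
        var client = account.CreateCloudBlobClient();

        // Create the target container if it doesn't exist yet.
        var container = client.GetContainerReference("backups");
        container.CreateIfNotExists();

        // Stream the local BACPAC file into a block blob.
        var blob = container.GetBlockBlobReference("DBSource.bacpac");
        using (var stream = File.OpenRead(@"C:\Backups\DBSource.bacpac"))
        {
            blob.UploadFromStream(stream);
        }
    }
}
```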
On the management side, the application uses the simple role and membership providers to ensure that only properly authenticated and authorized users can access the backup job information. The structure for storing the backup job definitions in Azure Table Storage is very simple, but there are some obvious issues with this scheme that would need to be addressed in a production solution:

- Given the partition key/row key approach, you can only define one backup job per database.
- The database connection string is currently stored in clear text; it should be encrypted to avoid revealing sensitive information (the same is true for the Azure Storage connection string).

The LastJobId column is used to make it easy for the management UI to look up the logs from the last time the backup job was executed in the table that holds the logging data. The job id uses a simple approach of concatenating the SQL Server name, the database name and a time stamp. The backup worker runs as a Worker Role and performs most of the work, with the overall cycle being: