If you’re just getting into the NoSQL world and are eager to get your hands dirty, there’s good news…
You’ll be glad to know that there is an open-source solution called the Data Migration Tool that allows you to import data from SQL Server into CosmosDB. It also supports a host of other data sources, including the following:
- MongoDB
- Azure Table Storage
In this demo, however, I’ll show you how to import data from SQL Server into CosmosDB.
For the purposes of this demonstration, I’ll be using the AdventureWorks database, which is a free download from the Microsoft site.
After you finish downloading the Data Migration Tool, you’ll notice there are two executables: dt.exe, the command-line utility, and dtui.exe, the graphical (GUI) version. Which one you use depends on whether you prefer the command line or a GUI.
I’ll be using the GUI version, as it is easier to follow.
The following steps accomplish this task. First, fire up the Data Migration Tool via dtui.exe. See Fig. 1 below:
Now, let’s specify the source that you want to import from, which in this demo is SQL Server.
Specify the SQL Server connection string.
Let’s quickly flip over to SQL Server Management Studio and run the following:
SELECT * FROM [Purchasing].[vVendorWithAddresses]
Below is the output from the query above. Mind you, what you see here is just an FYI. We need to perform a little bit of ceremony to alias the fields so that we can shape the data for the document-oriented CosmosDB.
This view (as you can see in Fig. 5 below) is made up of multiple tables, with joins across Vendor, BusinessEntityAddress, Address, StateProvince, CountryRegion and AddressType.
I’ll massage the query a little and define aliases so as to shape the JSON document in a hierarchical fashion.
If you want to try it out in SQL Server Management Studio, just copy and run the query below:
SELECT CAST(BusinessEntityID AS VARCHAR) AS ID,
       Name AS [Name],
       AddressType AS [address.AddressType],
       AddressLine1 AS [address.AddressLine1],
       AddressLine2 AS [address.AddressLine2],
       City AS [address.location.city],
       StateProvinceName AS [address.location.StateProvinceName],
       PostalCode AS [address.PostalCode],
       CountryRegionName AS [address.CountryRegionName]
FROM [Purchasing].[vVendorWithAddresses]
Next, type a “.” in the Nesting Separator field. The “.” is used as a delimiter to split column names into sub-documents. See Fig. 6 below:
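To get a feel for what the nesting separator does, here’s a small Python sketch. This is my own illustration of the splitting behavior, not the tool’s actual code, and the sample row values are made up for demonstration rather than taken from AdventureWorks:

```python
import json

def nest_row(row, separator="."):
    """Split each column name on the separator and build sub-documents."""
    doc = {}
    for column, value in row.items():
        parts = column.split(separator)
        node = doc
        for part in parts[:-1]:
            # Descend into (or create) the sub-document for this segment.
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return doc

# One flattened row, shaped like the aliased query output above
# (values are illustrative only).
row = {
    "ID": "1492",
    "Name": "Example Bike Retailer",
    "address.AddressType": "Main Office",
    "address.AddressLine1": "28 Example Ct.",
    "address.location.city": "Bellingham",
    "address.location.StateProvinceName": "Washington",
    "address.PostalCode": "98225",
    "address.CountryRegionName": "United States",
}

print(json.dumps(nest_row(row), indent=2))
```

Running this prints a document with a nested `address` sub-document that itself contains a nested `location`, which is exactly the hierarchical shape we aliased the query for.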
Here, you’ll specify the target information in the following text fields:
- Export to: Choose “DocumentDB – Sequential record import (partitioned collection)”
- Connection String: As previously mentioned
- Collection: Here you can specify the name of your collection. In our case, it’s BlogTestStore
- Id Field: Leave it blank, as we have already defined the alias “ID” for the BusinessEntityID field in our query
- Number of Parallel Requests: Leave it at the default value of 10
- Indexing Policy: Right-click in the text area and select Range from the context menu. This is important.
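For reference, choosing Range fills the text area with an indexing policy roughly like the one below. This is a hand-written sketch of the classic DocumentDB range policy, not captured from the tool, so the exact JSON you see may differ by tool version:

```json
{
  "indexingMode": "consistent",
  "automatic": true,
  "includedPaths": [
    {
      "path": "/*",
      "indexes": [
        { "kind": "Range", "dataType": "Number", "precision": -1 },
        { "kind": "Range", "dataType": "String", "precision": -1 }
      ]
    }
  ],
  "excludedPaths": []
}
```

Range indexing is what enables range comparisons and ORDER BY queries against the imported documents, which is why this step matters.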
After you have entered this information, click Next.
Leave the default values as is and click Next again.
The summary page will give you a preview of all your selections. Review it quickly and click the Import button as the last step.
This action will trigger the import process. As you can see, 104 records were successfully imported from SQL Server into CosmosDB with zero failures.
Azure Portal (Data Explorer)
Let’s quickly flip over to the Azure portal and navigate to the Data Explorer tab. Here you’ll see the new database, with the collection and 104 documents created successfully.
That’s pretty much it folks! Hope you found this segment useful.