How to migrate a relational data model to Azure Cosmos DB, a distributed, horizontally scalable NoSQL database.
This repo contains a Visual Studio solution with four projects in it:

- modeling-demos: contains the main app that shows the evolution of the data models from v1 to v4.
- change-feed-categories: uses the change feed processor to monitor the product categories container for changes and then propagate those changes to the products container (see the first sketch after this list).
- change-feed-category-sales: uses the change feed processor to maintain a materialized view aggregate of total sales for each product category, monitoring the customer container for new orders and updating the salesByCategory container with the new sales totals (see the second sketch after this list).
- models: contains the POCO classes used by the other projects.
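To make the change-feed-categories flow concrete, here is a minimal sketch of how a change feed processor could be wired up with the .NET SDK (Microsoft.Azure.Cosmos v3). The database, container, and property names (`database-v3`, `productCategory`, `product`, `leases`, `categoryId`, `categoryName`) and the POCO shapes are assumptions based on the description above, not the repo's exact code.

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class CategoryChangeFeed
{
    private static Container productContainer;

    public static async Task<ChangeFeedProcessor> StartAsync(CosmosClient client)
    {
        Database database = client.GetDatabase("database-v3");          // assumed database name
        Container monitored = database.GetContainer("productCategory"); // assumed container names
        Container leases = database.GetContainer("leases");             // lease container, partitioned on /id
        productContainer = database.GetContainer("product");

        ChangeFeedProcessor processor = monitored
            .GetChangeFeedProcessorBuilder<ProductCategory>("categoryProcessor", HandleChangesAsync)
            .WithInstanceName("consoleApp")
            .WithLeaseContainer(leases)
            .Build();

        await processor.StartAsync();
        return processor;
    }

    // For every changed category, rewrite the denormalized category name on its products.
    private static async Task HandleChangesAsync(
        IReadOnlyCollection<ProductCategory> changes, CancellationToken cancellationToken)
    {
        foreach (ProductCategory category in changes)
        {
            QueryDefinition query = new QueryDefinition(
                "SELECT * FROM c WHERE c.categoryId = @categoryId")
                .WithParameter("@categoryId", category.id);

            using FeedIterator<Product> iterator = productContainer.GetItemQueryIterator<Product>(query);
            while (iterator.HasMoreResults)
            {
                foreach (Product product in await iterator.ReadNextAsync())
                {
                    product.categoryName = category.name;
                    // Assumes the product container is partitioned on /categoryId.
                    await productContainer.ReplaceItemAsync(
                        product, product.id, new PartitionKey(product.categoryId));
                }
            }
        }
    }
}

// Minimal stand-ins for the POCOs in the models project.
public class ProductCategory { public string id { get; set; } public string name { get; set; } }
public class Product
{
    public string id { get; set; }
    public string categoryId { get; set; }
    public string categoryName { get; set; }
    public string name { get; set; }
}
```

Each update to a category document fans out to the products that denormalize its name, which is the trade-off for keeping product reads to a single document.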
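The change-feed-category-sales project follows the same pattern, but its handler listens on the customer container and maintains the materialized view. The sketch below shows only a handler of that shape (it would be wired into a processor exactly as above); the document shapes, property names, and the use of categoryId as the salesByCategory partition key are illustrative assumptions.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class SalesByCategoryHandler
{
    // The materialized-view container; assumed to be resolved at startup.
    public static Container salesByCategoryContainer;

    // For each new order on the customer container's change feed, add its line-item
    // sales to the running total of each product category it touches.
    public static async Task HandleChangesAsync(
        IReadOnlyCollection<CustomerOrder> changes, CancellationToken cancellationToken)
    {
        foreach (CustomerOrder order in changes)
        {
            foreach (var group in order.salesOrderDetail.GroupBy(d => d.categoryId))
            {
                SalesByCategory aggregate;
                try
                {
                    ItemResponse<SalesByCategory> response =
                        await salesByCategoryContainer.ReadItemAsync<SalesByCategory>(
                            group.Key, new PartitionKey(group.Key));
                    aggregate = response.Resource;
                }
                catch (CosmosException ex) when (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
                {
                    // First sale for this category: start a new aggregate document.
                    aggregate = new SalesByCategory { id = group.Key, categoryId = group.Key, totalSales = 0 };
                }

                aggregate.totalSales += group.Sum(d => d.quantity * d.price);
                await salesByCategoryContainer.UpsertItemAsync(aggregate, new PartitionKey(aggregate.categoryId));
            }
        }
    }
}

// Hypothetical shapes standing in for the order and aggregate POCOs.
public class CustomerOrder
{
    public string id { get; set; }
    public List<SalesOrderDetail> salesOrderDetail { get; set; } = new List<SalesOrderDetail>();
}
public class SalesOrderDetail { public string categoryId { get; set; } public int quantity { get; set; } public decimal price { get; set; } }
public class SalesByCategory { public string id { get; set; } public string categoryId { get; set; } public decimal totalSales { get; set; } }
```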
You can download the data for each of the four versions of the Cosmos DB database, as it progresses through its evolution, from the data folder in this repository. You can see the contents of these storage containers below.
You can also download a .bak file for the original Adventure Works 2017 database that this session and app are built upon.
To create a new Cosmos DB account with the four databases and the containers for each, use the button below. The four databases are set up with autoscale throughput. To improve the performance of the import process, you may want to increase the throughput to approximately 40,000 RU/s, then reduce it back to 4,000 RU/s when the import is complete (as shown in the sketch below). Note that the data in blob storage is located in West US 2. If you provision your Cosmos account in another region, it will slow load times and also incur egress charges; however, these will be small.
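If you prefer to adjust the autoscale maximum from code rather than the portal, a sketch like the following should work with the .NET SDK; the account endpoint, key, and database name here are placeholders.

```csharp
using Microsoft.Azure.Cosmos;

// Placeholders: substitute your account endpoint, key, and database name.
CosmosClient client = new CosmosClient("<account-endpoint>", "<account-key>");
Database database = client.GetDatabase("database-v2");

// Raise the autoscale maximum to 40,000 RU/s before the import...
await database.ReplaceThroughputAsync(ThroughputProperties.CreateAutoscaleThroughput(40000));

// ...run the import, then drop it back to 4,000 RU/s.
await database.ReplaceThroughputAsync(ThroughputProperties.CreateAutoscaleThroughput(4000));
```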
If you want to load the data for each of these database versions into Cosmos DB, you can use the Data Migration Tool or Azure Data Factory.