Microsoft Azure Use Cases

Azure Cosmos DB use cases

This article provides an overview of several common use cases for Azure Cosmos DB. The recommendations in this article serve as a starting point as you develop your application with Cosmos DB.

After reading this article, you'll be able to answer the following questions:

  • What are the common use cases for Azure Cosmos DB?
  • What are the benefits of using Azure Cosmos DB for retail applications?
  • What are the benefits of using Azure Cosmos DB as a data store for Internet of Things (IoT) systems?
  • What are the benefits of using Azure Cosmos DB for web and mobile applications?

Introduction

Azure Cosmos DB is Microsoft’s globally distributed database service. The service is designed to allow customers to elastically (and independently) scale throughput and storage across any number of geographical regions. Azure Cosmos DB is the first globally distributed database service in the market today to offer comprehensive service level agreements encompassing throughput, latency, availability, and consistency.

Azure Cosmos DB is a globally distributed, multi-model database that is used in a wide range of applications and use cases. It is a good choice for any serverless application that needs order-of-millisecond response times and must scale rapidly and globally. It natively supports multiple data models (key-value, document, graph, and columnar) and multiple APIs for data access, including the SQL API, Azure Cosmos DB's API for MongoDB, the Gremlin API, and the Table API, in an extensible manner.
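
To make the SQL (Core) API mention concrete, here is a minimal, hedged sketch using the azure-cosmos Python SDK; the endpoint, key, database, container, and document values are placeholders chosen for illustration, not prescribed names.

    # pip install azure-cosmos
    from azure.cosmos import CosmosClient, PartitionKey

    # Placeholder endpoint and key -- replace with your own account values.
    client = CosmosClient("https://<your-account>.documents.azure.com:443/",
                          credential="<your-primary-key>",
                          consistency_level="Session")

    database = client.create_database_if_not_exists("appdata")
    container = database.create_container_if_not_exists(
        id="profiles",
        partition_key=PartitionKey(path="/userId"),
        offer_throughput=400)  # provisioned request units per second

    # Documents are schema-free JSON; a point read by id and partition key is the
    # fastest way to fetch a single item back.
    container.upsert_item({"id": "profile-1", "userId": "alice", "theme": "dark"})
    item = container.read_item(item="profile-1", partition_key="alice")
    print(item["theme"])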

The following are some attributes of Azure Cosmos DB that make it well-suited for high-performance applications with global ambition.

  • Azure Cosmos DB natively partitions your data for high availability and scalability. Azure Cosmos DB offers 99.99% guarantees for availability, throughput, low latency, and consistency on all single-region accounts and all multi-region accounts with relaxed consistency, and 99.999% read availability on all multi-region database accounts.
  • Azure Cosmos DB has SSD-backed storage with low-latency, order-of-millisecond response times.
  • Azure Cosmos DB's support for consistency levels like eventual, consistent prefix, session, and bounded staleness allows for full flexibility and a low cost-to-performance ratio. No other database service offers as much flexibility in consistency levels as Azure Cosmos DB.
  • Azure Cosmos DB has a flexible design that lets you scale to massive request volumes, on the order of trillions of requests per day.

These attributes are beneficial in web, mobile, gaming, and IoT applications that need low response times and need to handle massive amounts of reads and writes.

IoT and telematics

IoT use cases commonly share some patterns in how they ingest, process, and store data. First, these systems need to ingest bursts of data from device sensors of various locales. Next, these systems process and analyze streaming data to derive real-time insights. The data is then archived to cold storage for batch analytics. Microsoft Azure offers rich services that can be applied for IoT use cases including Azure Cosmos DB, Azure Event Hubs, Azure Stream Analytics, Azure Notification Hub, Azure Machine Learning, Azure HDInsight, and Power BI.

Bursts of data can be ingested by Azure Event Hubs, which offers high-throughput data ingestion with low latency. Ingested data that needs to be processed for real-time insight can be funneled to Azure Stream Analytics for real-time analytics. Data can be loaded into Azure Cosmos DB for ad hoc querying; once loaded, it is ready to be queried. In addition, new data and changes to existing data can be read from the change feed, a persistent, append-only log that stores changes to Cosmos DB containers in sequential order. All of the data, or just the changes, in Azure Cosmos DB can be used as reference data in real-time analytics. The data can be further refined and processed by connecting Azure Cosmos DB to HDInsight for Pig, Hive, or MapReduce jobs, and the refined data can then be loaded back into Azure Cosmos DB for reporting.
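
As a hedged sketch of the change feed step, the snippet below reads a container's change feed with the azure-cosmos Python SDK. The account details and the iot/telemetry names are illustrative, and the keyword arguments of query_items_change_feed can differ between SDK versions, so treat this as a starting point rather than a definitive implementation.

    # pip install azure-cosmos
    from azure.cosmos import CosmosClient

    # Placeholder account details and container names.
    client = CosmosClient("https://<your-account>.documents.azure.com:443/",
                          credential="<your-primary-key>")
    telemetry = client.get_database_client("iot").get_container_client("telemetry")

    # The change feed is a persistent, append-only log: each item is the latest
    # version of an inserted or updated document, returned in sequential order.
    for change in telemetry.query_items_change_feed(is_start_from_beginning=True):
        print(change["id"], change.get("deviceId"), change.get("temperature"))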

For a sample IoT solution using Azure Cosmos DB, Event Hubs, and Storm, see the hdinsight-storm-examples repository on GitHub.

For more information on Azure offerings for IoT, see Create the Internet of Your Things.

Retail and marketing

Azure Cosmos DB is used extensively in Microsoft's own e-commerce platforms that run the Windows Store and Xbox Live. It is also used in the retail industry for storing catalog data and for event sourcing in order processing pipelines.

Catalog data usage scenarios involve storing and querying a set of attributes for entities such as people, places, and products. Some examples of catalog data are user accounts, product catalogs, IoT device registries, and bill of materials systems. Attributes for this data may vary and can change over time to fit application requirements.

Consider an example of a product catalog for an automotive parts supplier. Every part may have its own attributes in addition to the common attributes that all parts share. Furthermore, attributes for a specific part can change the following year when a new model is released. Azure Cosmos DB supports flexible schemas and hierarchical data, and thus it is well suited for storing product catalog data.
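
As a hedged sketch of such a flexible-schema catalog, the snippet below stores two parts that share common attributes but carry their own part-specific ones, then queries on one of those part-specific attributes; all database, container, and property names are assumptions made for the example.

    # pip install azure-cosmos
    from azure.cosmos import CosmosClient, PartitionKey

    client = CosmosClient("https://<your-account>.documents.azure.com:443/",
                          credential="<your-primary-key>")
    parts = client.create_database_if_not_exists("catalog").create_container_if_not_exists(
        id="parts", partition_key=PartitionKey(path="/category"))

    # Two parts with different attribute sets coexist in the same container -- no schema change needed.
    parts.upsert_item({"id": "brake-pad-001", "category": "brakes", "material": "ceramic"})
    parts.upsert_item({"id": "headlight-204", "category": "lighting", "lumens": 1400, "bulbType": "LED"})

    # Query on a part-specific attribute; properties are indexed automatically by default.
    for item in parts.query_items(
            query="SELECT c.id FROM c WHERE c.lumens >= @minLumens",
            parameters=[{"name": "@minLumens", "value": 1000}],
            enable_cross_partition_query=True):
        print(item["id"])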

Azure Cosmos DB is often used for event sourcing to power event-driven architectures using its change feed functionality. The change feed provides downstream microservices the ability to reliably and incrementally read inserts and updates (for example, order events) made to an Azure Cosmos DB container. This functionality can be leveraged to provide a persistent event store as a message broker for state-changing events and to drive the order processing workflow between many microservices (which can be implemented as serverless Azure Functions).

In addition, data stored in Azure Cosmos DB can be integrated with HDInsight for big data analytics via Apache Spark jobs. For details on the Spark Connector for Azure Cosmos DB, see Run a Spark job with Cosmos DB and HDInsight.

Gaming

The database tier is a crucial component of gaming applications. Modern games perform graphical processing on mobile/console clients, but rely on the cloud to deliver customized and personalized content like in-game stats, social media integration, and high-score leaderboards. Games often require single-millisecond latencies for reads and writes to provide an engaging in-game experience. A game database needs to be fast and be able to handle massive spikes in request rates during new game launches and feature updates.

Azure Cosmos DB is used by games like The Walking Dead: No Man's Land by Next Games, and Halo 5: Guardians. Azure Cosmos DB provides the following benefits to game developers:

  • Azure Cosmos DB allows performance to be scaled up or down elastically. This allows games to handle updating profiles and stats for anywhere from dozens to millions of simultaneous gamers by making a single API call.
  • Azure Cosmos DB supports millisecond reads and writes to help avoid any lags during game play.
  • Azure Cosmos DB's automatic indexing allows for filtering against multiple different properties in real time, for example, locating players by their internal player IDs, or their GameCenter, Facebook, or Google IDs, or querying based on player membership in a guild (see the sketch after this list). This is possible without building complex indexing or sharding infrastructure.
  • Social features including in-game chat messages, player guild memberships, challenges completed, high-score leaderboards, and social graphs are easier to implement with a flexible schema.
  • Azure Cosmos DB, as a managed platform-as-a-service (PaaS), requires minimal setup and management work, allowing for rapid iteration and reduced time to market.
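
The sketch below illustrates the multi-identity lookup mentioned above, again with the azure-cosmos Python SDK; the database, container, and property names are assumptions made for the example, not an existing schema.

    # pip install azure-cosmos
    from azure.cosmos import CosmosClient

    client = CosmosClient("https://<your-account>.documents.azure.com:443/",
                          credential="<your-primary-key>")
    players = client.get_database_client("game").get_container_client("players")

    # Locate a player by any of several identity properties; automatic indexing
    # means no secondary indexes or sharding infrastructure have to be built first.
    query = ("SELECT * FROM c WHERE c.playerId = @id "
             "OR c.facebookId = @id OR c.gameCenterId = @id")
    for player in players.query_items(
            query=query,
            parameters=[{"name": "@id", "value": "player-42"}],
            enable_cross_partition_query=True):
        print(player["id"], player.get("guild"))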

Web and mobile applications

Azure Cosmos DB is commonly used within web and mobile applications, and is well suited for modeling social interactions, integrating with third-party services, and building rich personalized experiences. The Cosmos DB SDKs can be used to build rich iOS and Android applications using the popular Xamarin framework.

Social Applications

A common use case for Azure Cosmos DB is to store and query user-generated content (UGC) for web, mobile, and social media applications. Some examples of UGC are chat sessions, tweets, blog posts, ratings, and comments. Often, the UGC in social media applications is a blend of free-form text, properties, tags, and relationships that is not bounded by a rigid structure. Content such as chats, comments, and posts can be stored in Cosmos DB without requiring transformations or complex object-to-relational mapping layers. Data properties can be added or modified easily to match requirements as developers iterate over the application code, thus promoting rapid development.

Applications that integrate with third-party social networks must respond to changing schemas from these networks. Because data is automatically indexed by default in Cosmos DB, it is ready to be queried at any time, so these applications have the flexibility to retrieve projections tailored to their respective needs.

Many social applications run at global scale and can exhibit unpredictable usage patterns. Flexibility in scaling the data store is essential as the application layer scales to match usage demand. You can scale out by adding additional data partitions under a Cosmos DB account, and you can also create additional Cosmos DB accounts across multiple regions. For Cosmos DB service region availability, see Azure Regions.

Personalization

Modern applications come with complex views and experiences that are typically dynamic, catering to user preferences, moods, and branding needs. Hence, applications need to be able to retrieve personalized settings efficiently to render UI elements and experiences quickly.

JSON, a format natively supported by Cosmos DB, is an effective format for representing UI layout data: it is not only lightweight, but also easily interpreted by JavaScript. Cosmos DB offers tunable consistency levels that allow fast reads with low-latency writes. Hence, storing UI layout data, including personalized settings, as JSON documents in Cosmos DB is an effective way to get this data across the wire.

Next steps

  • To get started with Azure Cosmos DB, follow our quick starts, which walk you through creating an account and getting started with Cosmos DB.

  • If you'd like to read more about customers using Azure Cosmos DB, see the customer case studies page.

Azure Data Box Gateway use cases

Azure Data Box Gateway is a cloud storage gateway device that resides on your premises and sends your image, media, and other data to Azure. This cloud storage gateway is a virtual machine provisioned in your hypervisor. You write data to this virtual device using the NFS and SMB protocols, and it then sends the data to Azure. This article provides a detailed description of the scenarios where you can deploy this device.

Use Data Box Gateway for the following scenarios:

  • To continuously ingest massive amounts of data.
  • For cloud archival of data in a secure and efficient way.
  • For incremental data transfer over the network after the initial bulk transfer is done using Data Box.

Each of these scenarios is described in detail in the subsequent sections.

Continuous data ingestion

One of the primary advantages of Data Box Gateway is the ability to continuously ingest data into the device to copy to the cloud, regardless of the data size.

As data is written to the gateway device, the device uploads it to Azure Storage. When local storage reaches a certain threshold, the device automatically manages space by removing files locally while retaining their metadata. Keeping a local copy of the metadata enables the gateway device to upload only the changes when a file is updated. The data uploaded to your gateway device should follow the guidelines in Data upload caveats.

As the device fills up with data, it starts throttling the ingress rate (as needed) to match the rate at which data is uploaded to the cloud. To monitor the continuous ingestion on the device, you use alerts. These alerts are raised once the throttling starts and are cleared once the throttling has stopped.

Cloud archival of data

Use Data Box Gateway when you want to retain your data in the cloud for the long term. You can use the Archive tier of storage for long-term retention.

Archive tier is optimized to store rarely accessed data for at least 180 days. The Archive tier offers the lowest storage costs but has the highest access costs. For more information, go to Archive access tier.

Move data to Archive tier

Before you begin, make sure that you have a running Data Box Gateway device. Follow the steps detailed in Tutorial: Prepare to deploy Azure Data Box Gateway and keep advancing to the next tutorial until you have an operational device.

  • Use the Data Box Gateway device to upload data to Azure through the usual transfer procedure as described in Transfer data via Data Box Gateway.
  • After the data is uploaded, you will need to move it to the Archive tier. You can set the blob tier in two ways: with an Azure PowerShell script or with an Azure Storage lifecycle management policy (a Python alternative is sketched after this list).
    • If using Azure PowerShell, follow these steps to move the data to Archive tier.
    • If using Azure Lifecycle Management, follow these steps to move the data to Archive tier.
      • Register for the preview of Blob Lifecycle management service to use Archive tier.
      • Use the following policy to Archive data on ingest.
  • Once the blobs are marked as Archive, they can no longer be modified by the gateway unless they are moved to the hot or cool tier. If the file is in local storage, any changes made to the local copy (including deletes) are not uploaded to the Archive tier.
  • To read data in Archive storage, it must be rehydrated by changing the blob tier to hot or cool. Refreshing the share on the gateway does not rehydrate the blob.
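
As a hedged alternative to the PowerShell script referenced above, the sketch below sets uploaded block blobs to the Archive tier (and shows rehydration back to Hot) with the azure-storage-blob Python SDK; the connection string and container name are placeholders for the container your Data Box Gateway share uploads to.

    # pip install azure-storage-blob
    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string and container name.
    service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
    container = service.get_container_client("databoxgateway")

    # Move every uploaded block blob to the Archive tier for low-cost, long-term retention.
    for blob in container.list_blobs():
        container.get_blob_client(blob.name).set_standard_blob_tier("Archive")

    # Archived blobs must be rehydrated (for example back to the Hot tier) before they can be read:
    # container.get_blob_client("<blob-name>").set_standard_blob_tier("Hot")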

For more information, see Manage Azure Blob Storage Lifecycle.

Initial bulk transfer followed by incremental transfer

Use Data Box and Data Box Gateway together when you want to do a bulk upload of a large amount of data followed by incremental transfers. Use Data Box for the bulk transfer in an offline mode (initial seed) and Data Box Gateway for incremental transfers (ongoing feed) over the network.

Seed the data with Data Box

Follow these steps to copy the data to Data Box and upload to Azure Storage.

  1. Order your Data Box.

  2. Set up your Data Box.

  3. Copy data to Data Box via SMB.

  4. Return the Data Box, verify the data upload to Azure.

  5. Once the data upload to Azure is complete, all the data should be in Azure storage containers. In the storage account for Data Box, go to the Blob (and File) container to make sure that all the data is copied. Make a note of the container name, as you will use it later. For instance, a container named databox could then be used for the incremental transfer.

This bulk transfer completes the initial seeding phase.

Ongoing feed with Data Box Gateway

Follow these steps for ongoing ingestion by Data Box Gateway.

  1. Create a cloud share on Data Box Gateway. This share automatically uploads any data to the Azure Storage account. Go to Shares in your Data Box Gateway resource and click + Add share.

  2. Make sure this share maps to the container that contains the seeded data. For Select blob container, choose Use existing and browse to the container where the data from Data Box was transferred.

  3. After the share is created, refresh the share. This operation refreshes the on-premises share with the content from Azure.

    When the share is synced, the Data Box Gateway uploads incremental changes whenever the files are modified on the client.

Next steps

  • Review the Data Box Gateway system requirements.
  • Understand the Data Box Gateway limits.
  • Deploy Azure Data Box Gateway in Azure portal.