This is the second in a series of articles discussing the pros and cons of various strategies for migrating to the cloud. In this article, we explore five steps for a successful migration at a high level. As an Azure Data Architect with nearly a decade of experience, I have seen countless clients who want to create, in the cloud, a mirror image of what currently runs in their data center. Generally, this is […]
Category: Azure
So You Want to Migrate to the Cloud
This is the first in a series of articles discussing the pros and cons of various strategies for migrating to the cloud. In this article, we explore “the Cloud” at a high level. As an Azure Data Architect with nearly a decade of experience, I have performed numerous migrations to Azure and helped clients find the right path to “the cloud.” In 2010, Gartner defined the Five Rs of Cloud Migration: Rehost, Refactor, Revise, Rebuild […]
IOPS in Azure SQL DB
I learned a bit of trivia regarding Azure SQL DB today. In the Standard tier, IOPS are limited to 1–5 per DTU. That means at an S3 (100 DTUs), you get a maximum of 500 IOPS. DTU IOPS are counted per I/O operation, not by size: it doesn’t matter whether you’re reading or writing a single row to a table with a single char(1) column or a single row to a table with 500 nvarchar(max) columns, 1 […]
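The arithmetic behind that ceiling is worth spelling out. A minimal sketch, assuming the 5-IOPS-per-DTU upper bound described above (the tier-to-DTU figures are the published Standard-tier sizes):

```powershell
# Estimate the Standard-tier IOPS ceiling from the DTU count,
# assuming the upper end (5) of the 1-5 IOPS/DTU range.
$standardTiers = [ordered]@{ S0 = 10; S1 = 20; S2 = 50; S3 = 100 }
$iopsPerDtu = 5

foreach ($tier in $standardTiers.Keys) {
    $maxIops = $standardTiers[$tier] * $iopsPerDtu
    "{0}: {1} DTUs -> up to {2} IOPS" -f $tier, $standardTiers[$tier], $maxIops
}
# S3: 100 DTUs -> up to 500 IOPS
```

At the low end of the range (1 IOPS/DTU), the same S3 could deliver as few as 100 IOPS, which is why size-insensitive I/O counting matters for wide-row workloads.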
Taking Inventory of Power BI Premium Workspaces
While Microsoft’s Power BI Premium Capacity Metrics app provides a wealth of information on the health of your Power BI Premium capacity datasets, queries, and refreshes, one feature I’ve found missing is a simple list of the reports hosted in premium workspaces. The following PowerShell script generates a list of workspace names and the reports and datasets in each workspace. Sample output:

Executive Reports
....Reports
........Executive Daily Report
....Datasets
........Executive Daily Report
[…]
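One way such a script might look — this is a minimal sketch, not necessarily the article’s full script — uses the MicrosoftPowerBIMgmt module’s `Get-PowerBIWorkspace`, `Get-PowerBIReport`, and `Get-PowerBIDataset` cmdlets, and assumes you have permission to enumerate workspaces across the tenant:

```powershell
# Requires the MicrosoftPowerBIMgmt module and tenant-level read rights.
Connect-PowerBIServiceAccount

# IsOnDedicatedCapacity distinguishes Premium workspaces from shared ones.
$premiumWorkspaces = Get-PowerBIWorkspace -Scope Organization -All |
    Where-Object { $_.IsOnDedicatedCapacity }

foreach ($ws in $premiumWorkspaces) {
    $ws.Name
    "....Reports"
    Get-PowerBIReport  -WorkspaceId $ws.Id | ForEach-Object { "........$($_.Name)" }
    "....Datasets"
    Get-PowerBIDataset -WorkspaceId $ws.Id | ForEach-Object { "........$($_.Name)" }
}
```

Filtering on `IsOnDedicatedCapacity` is the key step: it limits the inventory to workspaces backed by Premium capacity rather than shared capacity.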
Taking Inventory of Azure Data Factory V1 with PowerShell
I recently had to take inventory of an Azure Data Factory V1 to identify pipelines that were no longer required and to audit corporate naming conventions. Originally, the plan was to examine each ADF component in the portal, but that proved cumbersome and tedious. We could also examine the Visual Studio ADF project and review the JSON there. However, to quickly generate a list of Linked Services, Datasets and Pipelines, I used the […]
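A sketch of that inventory approach, using the Az.DataFactory V1 cmdlets — the resource group and factory names below are placeholders, and property names may vary slightly by module version:

```powershell
# Requires the Az.DataFactory module; V1 factories use the non-V2 cmdlets.
Connect-AzAccount
$rg  = 'my-resource-group'   # placeholder
$adf = 'my-data-factory'     # placeholder

"Linked Services:"
Get-AzDataFactoryLinkedService -ResourceGroupName $rg -DataFactoryName $adf |
    ForEach-Object { "    $($_.LinkedServiceName)" }

"Datasets:"
Get-AzDataFactoryDataset -ResourceGroupName $rg -DataFactoryName $adf |
    ForEach-Object { "    $($_.DatasetName)" }

"Pipelines:"
Get-AzDataFactoryPipeline -ResourceGroupName $rg -DataFactoryName $adf |
    ForEach-Object { "    $($_.PipelineName)" }
```

Piping the same output to a file gives a point-in-time inventory you can diff against your naming-convention standards, without clicking through each component in the portal.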