Taking Inventory of Power BI Premium Workspaces

While Microsoft's Power BI Premium Capacity Metrics app provides a wealth of information on the health of your Power BI Premium capacity's datasets, queries, and refreshes, one feature I've found missing is a simple list of the reports hosted in Premium workspaces. The following PowerShell script generates a list of workspace names and the reports and datasets in each workspace.

Sample output:

Executive Reports
....Reports
........Executive Daily Report
....Datasets
........Executive Daily Report

[…]
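A minimal sketch of this kind of inventory, using the MicrosoftPowerBIMgmt module; the Premium filter and the dotted output format here are my own illustration, not necessarily the post's actual script:

```powershell
# Requires the MicrosoftPowerBIMgmt module: Install-Module MicrosoftPowerBIMgmt
Connect-PowerBIServiceAccount

# IsOnDedicatedCapacity marks workspaces backed by Premium capacity
$premiumWorkspaces = Get-PowerBIWorkspace | Where-Object { $_.IsOnDedicatedCapacity }

foreach ($ws in $premiumWorkspaces) {
    Write-Output $ws.Name
    Write-Output "....Reports"
    Get-PowerBIReport -WorkspaceId $ws.Id |
        ForEach-Object { Write-Output "........$($_.Name)" }
    Write-Output "....Datasets"
    Get-PowerBIDataset -WorkspaceId $ws.Id |
        ForEach-Object { Write-Output "........$($_.Name)" }
}
```

By default these cmdlets see only workspaces the signed-in user belongs to; a tenant admin can widen the scan with `-Scope Organization`.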

Taking Inventory of Azure Data Factory V1 with PowerShell

I recently had to take inventory of an Azure Data Factory V1 instance to help identify pipelines that were no longer required, as well as to audit corporate naming conventions. Originally, the plan was to examine each ADF component in the portal, but that is cumbersome and tedious. We could also examine the Visual Studio ADF project and review the JSON there. However, to quickly generate a list of Linked Services, Datasets and Pipelines, I used the […]
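One way to pull such a list is with the AzureRM.DataFactories cmdlets for ADF V1; the resource group and factory names below are placeholders, and this sketch is my own illustration rather than the post's elided script:

```powershell
# ADF V1 cmdlets from the AzureRM.DataFactories module
$RGName = "ResourceGroupName"   # placeholder
$DFName = "DataFactoryName"     # placeholder

# Enumerate each component type by name
Get-AzureRmDataFactoryLinkedService -DataFactoryName $DFName -ResourceGroupName $RGName |
    Select-Object LinkedServiceName
Get-AzureRmDataFactoryDataset -DataFactoryName $DFName -ResourceGroupName $RGName |
    Select-Object DatasetName
Get-AzureRmDataFactoryPipeline -DataFactoryName $DFName -ResourceGroupName $RGName |
    Select-Object PipelineName
```

Piping the three lists to Export-Csv makes the naming-convention audit a spreadsheet exercise instead of a portal crawl.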

Quick Tip: Suspend/Resume All ADF V1 Pipelines

In Dev and QA environments, it is occasionally necessary to pause all Azure Data Factory V1 pipelines. PowerShell makes quick work of an otherwise manual task.

Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId "PUT-Y0UR-SUBSCR1PT10N-1D-H3R3"

$RGName = "ResourceGroupName"
$DFName = "DataFactoryName"

$pipelines = Get-AzureRmDataFactoryPipeline -DataFactoryName $DFName -ResourceGroupName $RGName

foreach ($pl in $pipelines) {
    Suspend-AzureRmDataFactoryPipeline -DataFactoryName $DFName -ResourceGroupName $RGName -Name $pl.PipelineName
    #Resume-AzureRmDataFactoryPipeline -DataFactoryName $DFName -ResourceGroupName $RGName -Name $pl.PipelineName
}

Create Data “Thumbnail” in Azure Data Lake Store

Recently, a colleague could not create a model in Power BI Desktop because the dataset files were too large; the smallest file was 45 GB. He asked if we could rerun the Azure Data Lake Analytics (U-SQL) job to create a smaller dataset. We could; however, one of the jobs that creates the datasets runs for about four hours. The solution I came up with was to create a dataset "thumbnail" which would provide a small […]
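The general idea of a dataset "thumbnail" can be sketched in a few lines of PowerShell; this local-file version (with hypothetical paths) just keeps the header row plus the first N data rows, and is only an illustration of the concept, not the U-SQL approach the post describes:

```powershell
# Illustrative only: produce a small "thumbnail" of a large delimited file
# by keeping the header plus the first 1,000 data rows. Paths are hypothetical.
$source = "C:\data\large_dataset.csv"
$target = "C:\data\large_dataset_thumb.csv"

# -TotalCount stops reading after the requested number of lines,
# so the 45 GB file is never loaded in full
Get-Content $source -TotalCount 1001 | Set-Content $target
```

A head-of-file sample like this preserves the schema for modeling in Power BI Desktop, though it will not reflect the value distribution of the full dataset.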

Finding Column Definitions in Oracle, SQL Server and DB2

In order to speed up the development of Azure Data Factory pipelines, I created a table containing definitions of various database tables. Each of the big three databases provides a method for querying table and column definitions.

SQL Server:

select table_catalog, table_schema, table_name, column_name, ordinal_position, data_type,
       coalesce(character_maximum_length, numeric_precision) as column_length
from information_schema.columns

Oracle:

select 'myOracleDatabase' as databasename, owner, table_name, column_name, column_id, data_type, data_length
from all_tab_columns

DB2:

select 'myDB2Database' as databasename, tabschema, tabname, colname, colno, typename, length […]