Configuring Azure Log Analytics for Power BI

Many enterprise organizations use Power BI Premium capacity, often with numerous workspaces. Governing these workspaces and their activities is challenging for tenant or Power BI admins. While Microsoft’s Fabric capacity metrics report offers details on background and interactive operations at the report, dataset, or user level, there are cases where configuring Azure Log Analytics is necessary. Azure Log Analytics comprehensively monitors Power BI activities, helping administrators track performance, usage patterns, and system health.

What is Fabric Capacity Metrics?

Microsoft’s Fabric capacity metrics offer insights into our capacity consumption, aiding in decisions on resource utilization. These metrics provide details on utilization, throttling, and coverage at the report, dataset, and user levels. The compute page presents an overview of capacity performance, divided into three visuals: a ribbon chart and a line and stacked column chart at the top, and a matrix table at the bottom. Clicking the ribbon chart shows an hourly view of our capacity usage, and we can drill down to a specific day to identify daily patterns.

The TimePoint details page shows the time range details of interactive operations, background operations, burns, and overages.

What is Azure Log Analytics?

Azure Log Analytics is a powerful tool within the Azure Monitor suite that collects and analyzes log data from various sources. By integrating with Power BI, it offers detailed insights into the system’s operations. Administrators can leverage Azure Log Analytics to create custom queries, visualize data, and set up alerts to proactively manage and optimize Power BI environments. This integration enhances the ability to identify performance bottlenecks, troubleshoot issues, and ensure efficient use of resources, ultimately leading to better governance and management of Power BI capacities.

The following image shows the Power BI integration with Azure Monitor.

Different scenarios to configure log analytics to Power BI

We can configure a workspace with Azure Log Analytics for the following scenarios:

  • Logs from a single workspace to a dedicated Log Analytics workspace
  • Logs from different workspaces to separate dedicated Log Analytics workspaces
  • Logs from multiple workspaces to a single dedicated Log Analytics workspace


We must consider the cost of configuring Azure Log Analytics with Power BI. There are currently two log ingestion plans: Basic Logs and Analytics Logs. The Basic Logs plan is sufficient for our configuration; it costs $0.50 per GB ingested and includes eight days of interactive retention.
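To get a feel for what the Basic Logs pricing above means in practice, here is a minimal Python sketch that estimates a monthly ingestion bill. The daily log volume is an invented example figure; only the $0.50/GB rate comes from the pricing quoted in this article.

```python
# Rough cost sketch for the Basic Logs ingestion plan discussed above.
# The per-GB rate is the one quoted in this article; daily volume is a
# made-up example and should be replaced with your tenant's real figure.
BASIC_LOGS_PRICE_PER_GB = 0.50  # USD per GB ingested

def monthly_ingestion_cost(daily_gb: float, days: int = 30) -> float:
    """Estimate the monthly ingestion cost in USD for a given daily log volume."""
    return daily_gb * days * BASIC_LOGS_PRICE_PER_GB

# Example: a tenant ingesting about 2 GB of Power BI logs per day
print(monthly_ingestion_cost(2.0))  # 2 GB/day * 30 days * $0.50/GB = 30.0
```

Interactive retention beyond the included eight days is billed separately, so the real bill can be higher than this ingestion-only estimate.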

For detailed pricing information, refer to the Azure Monitor pricing page.

How to configure Azure log analytics with Power BI

There are two elements required to get Azure Log Analytics working for Power BI:

  1. Create and Configure Azure Log Analytics workspace in the Azure portal
  2. Enable Log analytics for Power BI in the Power BI Admin portal

A. Create and Configure the Log Analytics workspace

  1. In the Azure portal, search for “Log Analytics workspaces” and click “Add.” Select a subscription and resource group, enter a unique name for the workspace, and choose a region. Click “Review + Create” to review the settings, and then select “Create.” The workspace is created with the default pay-as-you-go pricing tier.
  2. Register the ‘microsoft.insights’ resource provider in the Azure subscription where we collect Power BI log data.
    Sign in to the Azure portal and select the subscription that we want to use with Log Analytics and that contains our Log Analytics workspaces. In the Settings section, select Resource providers, as shown in the following image.

Search for microsoft.insights under Resource providers, then select Register.

The user who sets up Log Analytics integration in Power BI must have the Log Analytics Contributor role on the Log Analytics workspace. To verify:
a) Select Access control (IAM) for the subscription in the Azure portal.
b) Select Role assignments from the top selections in the pane.
c) Confirm there is a “Log Analytics Contributor” entry for the user who configures Log Analytics, as shown below.

B. Enable Log analytics in the Power BI Admin portal

  1. In the Power BI Admin portal, go to Tenant Settings > Audit and usage settings, and expand Azure Log Analytics connections for workspace administrators.
    a) To allow workspace admins to enable Log Analytics, set the slider to Enabled and specify the needed security groups under Apply to, as shown in the following image.

2. In the Premium workspace, workspace admins can enable Log Analytics.

  • Go to Settings as shown in the following image.
  • In the Settings pane, select Azure Connections, then expand Log Analytics.
  • Select Subscription, Resource group, and Log Analytics workspace from the respective drop-down and then click Save.
  • After completing the above steps, the expanded Log Analytics section should look like the following image.

Steps to analyze Power BI usage and performance in Log Analytics:

  1. Log in to the Azure portal. In the search bar at the top, type “Log Analytics workspaces” and select it from the dropdown list.
  2. Select the Log Analytics workspace configured to receive your Power BI logs.
  3. In the selected Log Analytics workspace, click on the “Logs” option in the left-hand menu. This opens the Log Analytics Query Editor.
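Besides the portal’s query editor, logs can also be pulled programmatically. The sketch below uses the azure-monitor-query Python SDK; the workspace ID is a placeholder, and the sample query reads from the PowerBIDatasetsWorkspace table that Power BI writes its workspace-level logs to.

```python
from datetime import timedelta

# Placeholder; replace with your Log Analytics workspace GUID.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"

# Power BI sends workspace-level log data to the PowerBIDatasetsWorkspace table.
SAMPLE_QUERY = """
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1d)
| take 10
"""

def fetch_logs(workspace_id: str, query: str):
    """Run a KQL query against a Log Analytics workspace.

    Requires `pip install azure-monitor-query azure-identity` and a signed-in
    Azure identity (e.g. via `az login`); the SDK imports are deferred so this
    module can be inspected without those packages installed.
    """
    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())
    return client.query_workspace(
        workspace_id=workspace_id, query=query, timespan=timedelta(days=1)
    )
```

This is a sketch under the stated assumptions, not a production client; real code should also check the response status and handle partial results.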


Here are some useful Kusto Query Language (KQL) queries for Power BI admins to explore various aspects of Power BI usage and performance in Log Analytics:

1.    Monitor Query Execution Times

Identify long-running queries to optimize performance.

let LongRunningThreshold = 10000;
PowerBIDatasetUsage
| where EventType == "QueryExecution"
| where DurationMs > LongRunningThreshold
| project TimeGenerated, UserId, ReportName, DatasetName, DurationMs, Query
| order by DurationMs desc
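To make the query’s behavior concrete, the following Python sketch applies the same filter and sort to a handful of simulated log rows (the rows and field values are invented for illustration):

```python
# Simulated log rows standing in for the KQL table; field names mirror the
# columns projected by the query above.
events = [
    {"UserId": "alice", "ReportName": "Sales",   "DurationMs": 25000},
    {"UserId": "bob",   "ReportName": "Finance", "DurationMs": 4000},
    {"UserId": "carol", "ReportName": "Ops",     "DurationMs": 12000},
]

LONG_RUNNING_THRESHOLD = 10000  # same value as the KQL `let` above

# `where DurationMs > threshold` followed by `order by DurationMs desc`
long_running = sorted(
    (e for e in events if e["DurationMs"] > LONG_RUNNING_THRESHOLD),
    key=lambda e: e["DurationMs"],
    reverse=True,
)
print([e["ReportName"] for e in long_running])  # ['Sales', 'Ops']
```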

2.    Track Dataset Refresh Status

Monitor the success and failure of dataset refreshes.


PowerBIRefresh
| where EventType == "Refresh"
| project TimeGenerated, DatasetName, RefreshType, RefreshStatus, UserId, DurationMs
| order by TimeGenerated desc

3.    Identify the Most Active Users

Determine which users are the most active in Power BI.


PowerBIUsage
| summarize ActivityCount=count() by UserId
| order by ActivityCount desc
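The `summarize ActivityCount=count() by UserId` aggregation is a count-per-key grouping; this small Python sketch shows the same computation over simulated activity rows:

```python
from collections import Counter

# Simulated activity rows; only UserId matters for this aggregation.
events = [
    {"UserId": "alice"}, {"UserId": "bob"}, {"UserId": "alice"},
    {"UserId": "alice"}, {"UserId": "bob"}, {"UserId": "carol"},
]

# `summarize ActivityCount = count() by UserId | order by ActivityCount desc`
activity = Counter(e["UserId"] for e in events).most_common()
print(activity)  # [('alice', 3), ('bob', 2), ('carol', 1)]
```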

4.    Report Usage Patterns

Find out which reports are being accessed the most.


PowerBIUsage
| where EventType == "ReportView"
| summarize ViewCount=count() by ReportName
| order by ViewCount desc

5.    Capacity Utilization Over Time

Analyze how the Power BI capacity is being utilized over a period.


PowerBICapacityMetrics
| summarize avg(CPUPercentage) by bin(TimeGenerated, 1h)
| order by TimeGenerated desc
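The `bin(TimeGenerated, 1h)` function truncates each timestamp to its hour so that `avg()` runs per hourly bucket. A Python sketch of the same idea, over invented capacity samples:

```python
from collections import defaultdict
from datetime import datetime

# Simulated capacity samples: (timestamp, CPUPercentage); values are invented.
samples = [
    (datetime(2024, 5, 1, 9, 5),   40.0),
    (datetime(2024, 5, 1, 9, 40),  60.0),
    (datetime(2024, 5, 1, 10, 15), 80.0),
]

# `bin(TimeGenerated, 1h)`: truncate each timestamp to the hour,
# then average CPUPercentage within each hourly bucket.
buckets = defaultdict(list)
for ts, cpu in samples:
    buckets[ts.replace(minute=0, second=0, microsecond=0)].append(cpu)

hourly_avg = {hour: sum(v) / len(v) for hour, v in sorted(buckets.items())}
print(hourly_avg)  # 09:00 bucket -> 50.0, 10:00 bucket -> 80.0
```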

6.    Failed Query Analysis

Identify queries that have failed and analyze the reasons.


PowerBIDatasetUsage
| where EventType == "QueryExecution" and Status == "Failed"
| project TimeGenerated, UserId, ReportName, DatasetName, DurationMs, ErrorCode, ErrorDescription
| order by TimeGenerated desc

7.    User Activity Over Time

Track user activity over time to understand peak usage periods.


PowerBIUsage
| summarize ActivityCount=count() by bin(TimeGenerated, 1h), UserId
| order by TimeGenerated desc

8.    Dataset Refresh Durations

Analyze the duration of dataset refreshes to identify performance issues.


PowerBIRefresh
| where EventType == "Refresh"
| project TimeGenerated, DatasetName, DurationMs
| order by DurationMs desc

9.    Most Viewed Reports

Find out which reports are viewed the most frequently.


PowerBIUsage
| where EventType == "ReportView"
| summarize ViewCount=count() by ReportName
| order by ViewCount desc

10. Error Log Summary

Summarize errors to identify common issues and troubleshoot.


PowerBIErrorLogs
| summarize ErrorCount=count() by ErrorCode, ErrorDescription
| order by ErrorCount desc

To use these queries, Power BI logging must be configured to send data to Azure Log Analytics. The table names shown (PowerBIDatasetUsage, PowerBIUsage, PowerBIRefresh, PowerBICapacityMetrics, and PowerBIErrorLogs) may vary depending on how the logs are set up in your environment; adjust them as necessary to match your configuration.

How to disconnect Azure Log Analytics

We can disconnect from Azure Log Analytics to stop sending logs to Azure.

To disconnect, open the Power BI workspace settings, go to the Log Analytics settings, and select Disconnect from Azure. Then click Save to disconnect.

Please note that when we disconnect a Power BI workspace from Azure Log Analytics, logs are not deleted. Our data remains and follows the storage and retention policies that we set there.


Considerations and limitations

  1. Supported only on Premium workspaces.
  2. Only Workspace v2 supports Log Analytics connections.
  3. Azure Log Analytics doesn’t support tenant migration.
  4. Activities are only captured for semantic models physically hosted within the Premium workspace.
  5. Semantic models created on the web by uploading a CSV file don’t generate logs.
  6. For the Log Analytics feature, Power BI only sends data to the PowerBIDatasetsWorkspace table and doesn’t send data to the PowerBIDatasetsTenant table. This avoids storing duplicate log data in both locations.


Integrating Azure Log Analytics with Power BI enables enhanced monitoring and analysis of Power BI activities and performance. It allows users to gather detailed logs and metrics from Power BI and visualize them within Power BI dashboards for better insights. This integration ultimately enhances the overall governance and efficiency of Power BI environments within an organization.

For more details, Diggibyte Technologies Pvt Ltd has all the experts you need. Contact us today to embed intelligence into your organization.

Author: Team PowerBI
