The Power of DevOps and Deploying Power BI reports to different Environments [Guest Post]

We are very happy to publish our very first Guest Post by Islam El-Ghazali (@islamtg)!

Mission Statement

We want to create a source control and versioning process for our Power BI reports that allows us to publish reports and revert to previous versions.

  • Eliminate the need to save multiple copies of a Power BI report on the machine as v1, v2, etc.
  • Allow continuous integration and deployment of the reports to multiple workspaces (DEV, QA, PRD) with minimal effort.
  • Use Azure DevOps to automate the process while keeping control over what gets pushed to production.

The problem

There are some difficulties publishing a Power BI Report directly from Power BI Desktop, such as:

  • Power BI Desktop (the client tool) needs to be installed in every environment (Dev, QA, and Production).
  • Without proper source or version control, changes cannot be tracked.
  • The developer needs access to all the data sources across every environment. For example, if we have two environments (Dev and PROD) and the data source is SQL Server, the developer needs access to both the Dev and the PROD SQL Server.
  • The developer needs to manually change the data source settings for every environment.

Proposed Architecture

Best Practice

  • If you are using cloud data sources you don’t need a gateway, but for on-premises data sources you should use one. A gateway can also be useful for cloud data sources, because your credentials are then encrypted and stored on the gateway server rather than with Power BI.
  • Avoid hardcoding data sources in your reports; in other words, use parameters for your data source connections (a minimal sketch follows below).
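
To illustrate the parameter approach, here is a minimal sketch (not part of the original walkthrough) that lists the parameters of a published dataset using the Power BI cmdlets and REST API. The workspace name (Sales_DEV) and dataset name (SalesReport) are placeholders.

# Minimal sketch: inspect the parameters of a published dataset.
# Assumes the MicrosoftPowerBIMgmt module is installed; names are placeholders.
Connect-PowerBIServiceAccount                      # interactive sign-in for a quick check
$workspace = Get-PowerBIWorkspace -Name "Sales_DEV"
$dataset   = Get-PowerBIDataset -WorkspaceId $workspace.Id | Where-Object { $_.Name -eq "SalesReport" }
# GET .../datasets/{id}/parameters returns the parameters a release pipeline can later update
Invoke-PowerBIRestMethod -Url "groups/$($workspace.Id)/datasets/$($dataset.Id)/parameters" -Method Get |
    ConvertFrom-Json | Select-Object -ExpandProperty value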

CI/CD Process for Power BI Reports

As soon as we talk about automation (CI/CD), the first things that come to mind are APIs and cmdlets. For Power BI, we can use the Power BI REST API and the Power BI cmdlets to automate the deployment of reports.
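
As an illustration (not from the original post), here is a minimal PowerShell sketch of what the cmdlets make possible: publishing a .pbix to a workspace. The workspace and file names are placeholders.

# Minimal sketch: publish a .pbix with the Power BI cmdlets (names are placeholders)
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser -Force
Connect-PowerBIServiceAccount                      # interactive sign-in; the pipeline later uses a service principal
$workspace = Get-PowerBIWorkspace -Name "Sales_DEV"
# Upload the report, overwriting an existing report with the same name if present
New-PowerBIReport -Path ".\Reports\SalesReport.pbix" -WorkspaceId $workspace.Id -ConflictAction CreateOrOverwrite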

Maik van der Gaag created a great Azure DevOps extension called Power BI Actions, which makes it much easier to handle CI/CD for Power BI.

Power BI Reports in Azure Repo 

The initial step is to set up version control for our Power BI reports (.pbix files) by using an Azure Repo.

Note: You can also use other version control systems such as Bitbucket, TFVC, GitHub, or Subversion instead of an Azure Repo.

Steps:

  1. Create a new Azure Repo.
  2. Azure Build Pipeline (CI) for Power BI.
  3. Azure Release Pipeline (CD) for Power BI.

Create a new Azure Repo

We will need to create a new Azure Repo (or reuse an existing one) and specify a folder for the .pbix files. Then we clone the repo locally and commit our .pbix files to it; using VS Code makes this easier, but it is optional. Afterwards, whenever we change a file, we commit the changes and push the commits to Azure DevOps. Once the commits are pushed, they show up in the Azure Repo. A typical sequence is sketched below.
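
The exact commands depend on your setup, but a hedged sketch of the clone/commit/push sequence might look like this (the organization, project, repo, and file names are placeholders):

# Placeholder URLs and names; run from a local terminal (VS Code's integrated terminal works well)
git clone https://dev.azure.com/<organization>/<project>/_git/<repo>
cd <repo>
# Keep the .pbix files in the folder the build pipeline copies from (here: Reports)
mkdir Reports
Copy-Item "C:\PowerBI\SalesReport.pbix" -Destination "Reports\"
git add Reports/SalesReport.pbix
git commit -m "Add SalesReport.pbix"
git push origin master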

Azure Build Pipeline (CI) for Power BI

Now that we have a repo with the files, all we need to do is build the pipeline. We can build it in two ways: the first and more flexible way is to use YAML to define the steps and jobs; the other is the classic editor, which lets us create the pipeline from the GUI.

GUI:

To set up CI, we just need to include two tasks in the build pipeline: a Copy task and a Publish task.

Copy Task

Publish Task

YAML:

To set up CI, we write the YAML for the build pipeline, which includes the same two tasks: a Copy task and a Publish task.

# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: CopyFiles@2
  displayName: 'Copy Files to: Staging Artifact'
  inputs:
    SourceFolder: Reports
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    OverWrite: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'

Azure Release Pipeline (CD) for Power BI

First, we will need to authenticate to the Power BI service. We have two options.

Option 1 is to use a master account, which is essentially a username and password used to authenticate to the Power BI service.

Option 2 is to use a service principal (app ID/SPN): register an app in Azure Active Directory, grant it permissions to the Power BI API, add it to a security group, and then allow that security group in the Power BI tenant settings.

For more details – Embed Power BI content with service principal and an application secret
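
If you prefer to script the app registration instead of using the portal, a minimal sketch with the Az PowerShell module might look like the following. The display name is a placeholder, and the property that exposes the generated secret varies by Az version.

# Minimal sketch: create an app registration / service principal for use with the Power BI API.
# Requires the Az.Resources module; the display name is a placeholder.
Connect-AzAccount
$sp = New-AzADServicePrincipal -DisplayName "pbi-devops-deploy"
$sp.AppId                  # the application (client) ID for the service connection (ApplicationId on older Az versions)
$sp.PasswordCredentials    # the generated secret (property layout varies by Az version); store it securely, ideally in Key Vault
# Remember to add the service principal to a security group and allow that group
# to use the Power BI APIs in the Power BI admin portal.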

The SPN permissions are as follows:

For this walkthrough I have used the SPN, and it is the recommended approach.

Now, in Azure DevOps, we go to Project settings > Service connections and add our service principal (SPN/app ID) to the project, so that the release pipeline can use it to authenticate to the Power BI service.

Using Power BI Action Extension

As a first step, install the extension from the marketplace. We can then use its publish task and specify whether to authenticate with a master account or an SPN; I use the SPN. Next, we need a service connection for the SPN, which can be created either from the Service connections settings or directly from the task, where there is an option to add a new connection. We will need to provide the app ID and the secret.

Next, we create a release pipeline. Whether through the GUI or YAML, we reference the artifact created in the previous steps and then create our first stage, which should be DEV.

We can use variables to specify the root workspace name, for example “islamDevOps”: create a variable for it and reference that variable in each stage, appending a suffix such as _DEV, _UAT, or _PROD to match the workspace name.

In the stages we need to add approvals. Approvals let us sign off on promoting changes to the next stage: if DEV looks good, we move on; if it does not, we reject and go back to the drawing board. This is a good idea because it helps prevent mistakes from reaching production.

Ideally, we also add a step that changes the parameters between stages. This can be done with the same task, or we can write it in PowerShell and use the Power BI API, which gives us more control over the data sources.

The PowerShell script is as follows:

# Install the Power BI management modules on the agent
Install-Module -Name MicrosoftPowerBIMgmt.Profile -Verbose -Scope CurrentUser -Force
Install-Module -Name MicrosoftPowerBIMgmt.Workspaces -Verbose -Scope CurrentUser -Force

# Pipeline variables: dataset name and target workspace
$datasetname = "$(datasourceName)"
$workspacename = "$(WorkSpaceName)_DEV"

# Authenticate with the service principal (app ID + secret)
$clientsec = "$(pbispnsecret)" | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "$(Appid)", $clientsec
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -TenantId "$(tenent)"

# Look up the dataset ID in the target workspace
$workspace = Get-PowerBIWorkspace -Name $workspacename
$DatasetResponse = Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets" -Method Get | ConvertFrom-Json
$datasets = $DatasetResponse.value
foreach ($dataset in $datasets) {
    if ($dataset.name -eq $datasetname) {
        $datasetid = $dataset.id
        break
    }
}

# Build the parameter update payload
$postParams = @{
    updateDetails = @(
        @{
            name     = "$(Paramatername)"
            newValue = "$(datasourceValue)"
        }
    )
} | ConvertTo-Json

# Take over the dataset with the SPN, then update the parameter
Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets/$($datasetid)/Default.TakeOver" -Method Post
$response = Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets/$($datasetid)/Default.UpdateParameters" -Method Post -Body $postParams | ConvertFrom-Json

The YAML:

Power BI action

variables:
  WorkSpaceName: 'iselghaz-test'

steps:
- task: maikvandergaag.maikvandergaag-power-bi-actions.PowerBIActions.PowerBIActions@4
  displayName: 'Power BI Action: Publishing PBI Report'
  inputs:
    AuthenticationType: ServicePrincipal
    serviceConnectionSP: 'PBI_SPN2'
    WorkspaceName: '$(WorkSpaceName)_DEV'
    PowerBIPath: '$(System.DefaultWorkingDirectory)/PBI/VSTSTasks/SampleFiles/SampleFiles/**.pbix'

PowerShell Action

variables:
  datasourceNameDev: 'test'
  WorkSpaceName: 'iselghaz-test'
  pbispnsecret: '8XwHaS_yO1wRMhj92_muK3Mi1sVU~-E63A'
  Appid: 'f972cf2a-51ca-42e5-a6d2-8883ccd633fe'
  tenent: '1d59ae34-96e8-4e70-ad82-21da0342d285'
  Paramatername: 'test'
  datasourceValue: 'https://m365x688545.crm.dynamics.com/api/data/v8.2'

steps:
- powershell: |
    Install-Module -Name MicrosoftPowerBIMgmt.Profile -Verbose -Scope CurrentUser -Force
    Install-Module -Name MicrosoftPowerBIMgmt.Workspaces -Verbose -Scope CurrentUser -Force
    $datasetname = "$(datasourceNameDev)"
    $workspacename = "$(WorkSpaceName)_DEV"
    $clientsec = "$(pbispnsecret)" | ConvertTo-SecureString -AsPlainText -Force
    $credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "$(Appid)", $clientsec
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -TenantId "$(tenent)"
    $workspace = Get-PowerBIWorkspace -Name $workspacename
    $DatasetResponse = Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets" -Method Get | ConvertFrom-Json
    $datasets = $DatasetResponse.value
    foreach ($dataset in $datasets) {
        if ($dataset.name -eq $datasetname) {
            $datasetid = $dataset.id
            break
        }
    }
    $postParams = @{
        updateDetails = @(
            @{
                name     = "$(Paramatername)"
                newValue = "$(datasourceValue)"
            }
        )
    } | ConvertTo-Json
    Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets/$($datasetid)/Default.TakeOver" -Method Post
    $response = Invoke-PowerBIRestMethod -Url "groups/$($workspace.id)/datasets/$($datasetid)/Default.UpdateParameters" -Method Post -Body $postParams | ConvertFrom-Json
  displayName: 'PowerShell Script: Changing PBI Parameter'

The DEV release stage should now look something like this:

Now we are ready for continuous deployment to the other workspaces; we simply need to create a few variables for each workspace and map the correct variables to each stage.

I would recommend giving the variables meaningful names, such as DatasourceNameDev, to make it clear which one belongs to which stage.

Limitations

  • A minor limitation is that we are required to use the full Power BI file if we want to deploy both the model and the visuals, so we need the .pbix file rather than the .pbit template.
  • File sizes can get large.
  • For PHI and PII data, we recommend the use of parameters.
  • You cannot compare changes side by side in DevOps; you will need a tool like the ALM Toolkit for Power BI to see side-by-side changes.
  • Secrets – we are currently keeping secrets as variables in DevOps; it is better to keep them in Azure Key Vault and pull them from there (see the sketch after this list).
  • We are unable to use Power BI template files, which would provide more security for sensitive data such as PHI/PII.
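
As a sketch of the Key Vault approach mentioned above (the vault, secret, app ID, and tenant values are placeholders, and it assumes the Az.KeyVault module plus a Key Vault that already holds the SPN secret), the inline script could pull the secret at run time instead of reading a plain pipeline variable. Alternatively, the Azure Key Vault task in Azure Pipelines can map vault secrets to pipeline variables.

# Minimal sketch: read the SPN secret from Azure Key Vault instead of a DevOps variable.
# Vault, secret, app ID, and tenant values are placeholders; requires the Az.KeyVault module.
Install-Module -Name Az.KeyVault -Scope CurrentUser -Force
Connect-AzAccount
$clientsec  = (Get-AzKeyVaultSecret -VaultName "my-pbi-keyvault" -Name "pbi-spn-secret").SecretValue   # SecureString
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "<app id>", $clientsec
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $credential -TenantId "<tenant id>"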


10 responses to “The Power of DevOps and Deploying Power BI reports to different Environments [Guest Post]”

  1. whaley

    February 22, 2021 at 7:36 AM

    Hi, I am using Power BI action to upload a report, but I am getting ##[error]Operation returned an invalid status code ‘BadRequest’
    Any thoughts on how I can fix it?

  2. whaley

    February 22, 2021 at 7:37 AM

    Here is the log

    2021-02-22T12:30:50.2315811Z Starting Power BI Actions extension
    2021-02-22T12:30:50.2316540Z ### Required Module is needed. Importing now…
    2021-02-22T12:30:50.3810117Z ### Trying to import the incorporated module for PowerBI
    2021-02-22T12:30:50.5084436Z ##[warning]The names of some imported commands from the module ‘PowerBi’ include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
    2021-02-22T12:30:51.8666788Z Getting Power BI Workspace properties; BI – Dev
    2021-02-22T12:30:53.2646069Z Trying to publish PowerBI File: D:\a\r1\a\_DEV_PBI_CI\drop\a\Test.pbix
    2021-02-22T12:30:53.2669219Z Checking for existing Reports with the name: Test
    2021-02-22T12:30:54.0037229Z Done processing Power BI Actions
    2021-02-22T12:30:54.1228292Z ##[error]Operation returned an invalid status code ‘BadRequest’

    • Islam El-Ghazali

      February 25, 2021 at 10:41 AM

      When you publish a report from Power BI Desktop to the Power BI online service, it is split into a separate report and dataset. They both have the same name (and are in the same workspace/group), but they have their own unique IDs.

      When you re-publish the same (modified) report from Power BI Desktop, you will get a message to confirm that the dataset will be overwritten; the dataset ID will remain the same. However, if you delete the report and dataset and publish again, these IDs will be different.

      Can you confirm that the dataset ID is the same or exists in that workspace?

      If you run the PowerShell outside of Power BI, can you share the error from the PowerShell terminal?

  3. Gaurav

    March 27, 2021 at 8:54 AM

    Hi, I am getting the same error.
    ##[error]Operation returned an invalid status code ‘BadRequest’

    It’s a new report going into a new workspace.
    Can you help?

    • whaley

      March 27, 2021 at 11:28 PM

      Hi Gaurav,
      I fixed the “BadRequest” error by giving the service principal admin access to the workspace. Then the upload task runs successfully.
      Hope it helps.

  4. Islam El-Ghazali

    March 28, 2021 at 2:06 PM

    The pipeline does not check whether the file exists or not; it assumes that there is an existing file in the workspace. Because of this bug, I will update the steps so that step 1 is a PowerShell script that validates whether the file exists in the workspace: if it does not, publish it using the PowerShell API; if it already exists, continue to step 2. This should take care of that particular issue for future readers!

    Thanks for the feedback and update Gaurav!

  5. Ravindra

    March 8, 2023 at 2:21 AM

    Hi, I am using a YAML script to publish a PBI report using “task: maikvandergaag.maikvandergaag-power-bi-actions.PowerBIActions.PowerBIActions@4” – it is giving the error “##[error]The current operating system is not capable of running this task. That typically means the task was written for Windows only. For example, written for Windows Desktop PowerShell.”

    • Islam El-Ghazali

      March 9, 2023 at 10:03 AM

      What agent are you using? PowerShell is used to run some of the Power BI administrative tools, so you may need to check whether you have a Windows agent to run the PowerShell snippets.
