26.02.2021

Versioning, approval stages and automated deployment of Power BI reports with Azure DevOps

In modern development processes, automated CI/CD (Continuous Integration, Continuous Delivery) makes it possible to develop and integrate small solution increments on a frequent basis. Automated deployment pipelines for Power BI can be built with Azure DevOps services. Learn how we implemented this approach with one of our customers!

Our customer, who operates in a highly regulated business, selected Power BI as the reporting tool for its management reporting. The regulatory bodies require documentation not only of every change to the underlying data model, but also of every change to the data representation used for decision making in the respective business departments. In audits, our customer also has to verify that decision-relevant changes have been approved by authorized employees and to prove sufficient fault tolerance and consistency in data processing. As an additional requirement, the IT department planned to provide self-service capabilities that allow business departments to implement reports independently, while IT provides a centrally managed data model. To comply with both regulatory and business demands, we delivered a solution that automates the CI/CD processes.

Solution Elements – Automated deployment of reports and data models

For the development processes of its management reporting, our customer faced several business and regulatory requirements regarding the maintenance and improvement of reports and the data model. The development processes were not only essential for a stable and efficient system, but also served as verification during audits by regulatory bodies. Therefore, the architectural design had to allow for:

  • versioning of changes in the Power BI reports and the Power BI dataset (version control),
  • error handling to easily roll back to earlier versions of either the reports or the dataset,
  • a comprehensive approval process for publishing changes to the reports or the dataset, and
  • the tracking of changes based on the underlying requirements and demands.

Our solution leveraged the strengths of various tools:

Power BI

Power BI is Microsoft’s tool to “Enable everyone at every level of your organization to make confident decisions using up-to-the-minute analytics.” As it is highly integrated with the Microsoft Power Platform and various Azure Services, it comes with a huge base of connectors. Power BI allows you to flexibly build the analytics applications of your choice. 

Source: https://powerbi.microsoft.com/en-us/ 

Azure DevOps

Microsoft Azure DevOps is a cloud service to “Plan smarter, collaborate better, and ship faster with a set of modern dev services.” Azure DevOps combines services such as Azure Boards, Azure Pipelines, Azure Repos, Azure Test Plans, and Azure Artifacts, as well as various marketplace extensions, in a single productivity tool.

Source: https://azure.microsoft.com/en-us/services/devops/ 

PowerShell and .NET

Microsoft PowerShell is “a cross-platform task automation and configuration management framework, consisting of a command-line shell and scripting language. Unlike most shells, which accept and return text, PowerShell is built on top of the .NET Common Language Runtime (CLR), and accepts and returns .NET objects.”

Source: https://docs.microsoft.com/en-us/powershell/scripting/overview?view=powershell-7.1  
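This object-based pipeline is what makes PowerShell convenient for deployment automation. A minimal illustration:

    # Get-Date returns a System.DateTime object rather than plain text,
    # so .NET members can be called on the result directly.
    $release = (Get-Date).AddDays(7)
    $release.GetType().FullName   # prints: System.DateTime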

As in software projects, we utilized Azure DevOps for planning, creating, managing, and distributing code. PowerShell was used to automate the development processes and pipelines, including the approvals, while the self-service capabilities were provided within the Power BI service. The dataset was hosted in a distinct workspace, while the reports resided in dedicated workspaces for the business departments, connected to the dataset via a live connection.
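To illustrate this workspace layout, the following PowerShell sketch signs in with a service principal and resolves the workspaces, using the MicrosoftPowerBIMgmt module (the workspace names and environment variables are hypothetical):

    # Install-Module MicrosoftPowerBIMgmt   (one-time setup)
    # Sign in with the service principal used by the pipeline.
    $secret = ConvertTo-SecureString $env:PBI_CLIENT_SECRET -AsPlainText -Force
    $cred   = New-Object System.Management.Automation.PSCredential($env:PBI_CLIENT_ID, $secret)
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant $env:PBI_TENANT_ID

    # One workspace hosts the centrally managed dataset; each business
    # department gets its own report workspace with a live connection.
    $datasetWorkspace = Get-PowerBIWorkspace -Name "Dataset"
    $salesWorkspace   = Get-PowerBIWorkspace -Name "Reports-Sales"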

Process design – Continuous development for Power BI reporting

As the realization of the reports was conducted in an agile way, Azure Boards was chosen to describe demands and requirements in a backlog. For each sprint, the items to be implemented were selected into the sprint backlog and implemented in Power BI by the implementation team.

Integrate new features into the data model

Once changes to the data model are implemented by the IT department, the implementation team pushes them to Azure Repos, where a version log is kept automatically alongside information about the requirements and demands that caused these changes. These version logs can be used for later audits, while the automated pipeline ensures that approval processes are conducted accordingly, which is likewise verifiable via the logs.
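As an example of how changes stay linked to their requirements, commits pushed to Azure Repos can mention the ID of the driving Azure Boards work item; with the repository's commit mention linking enabled, Azure DevOps links the commit to that work item (the file path and work item ID are hypothetical):

    # Commit the changed model file and reference the driving work item.
    git add .\Model\SalesModel.pbix
    git commit -m "Add margin measures to the sales model (#4711)"
    git push origin feature/margin-measures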

After the development of the new features is completed, the automated pipeline starts the approval and deployment process for the data model. Before the data model enters the test workspace, approval must be given inside the Azure DevOps environment by the responsible person in the IT department. In the test workspace, the data model is tested by the IT department against the productive reports provided for testing (Prod-T workspace). If the tests are successful, the data model is released for approval into the productive deployment pipeline. These two pipelines were implemented with PowerShell scripts and YAML files in Azure Pipelines.
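A minimal sketch of the deployment step such a pipeline might run after approval, assuming the sign-in shown above has already happened (file, workspace, and dataset names are hypothetical; the approval gate itself is configured in Azure DevOps, not in the script):

    # Publish the .pbix containing the data model to the test workspace.
    $workspace = Get-PowerBIWorkspace -Name "Dataset-Test"
    New-PowerBIReport -Path ".\Model\SalesModel.pbix" `
                      -Name "SalesModel" `
                      -WorkspaceId $workspace.Id `
                      -ConflictAction CreateOrOverwrite

    # Trigger a refresh of the published dataset via the Power BI REST API.
    $dataset = Get-PowerBIDataset -WorkspaceId $workspace.Id |
               Where-Object Name -eq "SalesModel"
    Invoke-PowerBIRestMethod -Method Post `
        -Url "groups/$($workspace.Id)/datasets/$($dataset.Id)/refreshes"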

Integrate new features into reports

If changes are implemented in the reports by the business departments, they analogously push the changes to Azure Repos, where a version log is kept alongside information about the requirements and demands that caused these changes.

After a changed report has been saved in the repository, the automated pipeline starts the approval and deployment process to release the report for testing. Approval must be given inside the Azure DevOps environment by the responsible person in the IT department. In the test workspace, the report is tested by key users, and once testing has finished successfully, the new report is released for approval into the production workspaces. If the changes are approved by the responsible person in the IT department, the report is published both to the production workspace for the report users and to the Prod-T workspace for testing data model changes against the latest versions of the reports. These two pipelines were again implemented with PowerShell scripts and YAML files in Azure Pipelines.
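The final publishing step of the report pipeline might look like the following sketch, which deploys the approved report to both target workspaces (the workspace and file names are hypothetical):

    # After final approval, publish the report to the production workspace
    # for the report users and to the Prod-T workspace for dataset testing.
    foreach ($name in @("Reports-Sales-Prod", "Reports-Sales-Prod-T")) {
        $workspace = Get-PowerBIWorkspace -Name $name
        New-PowerBIReport -Path ".\Reports\SalesOverview.pbix" `
                          -Name "SalesOverview" `
                          -WorkspaceId $workspace.Id `
                          -ConflictAction CreateOrOverwrite
    }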
 

Key Takeaways

We had two important learnings from this project:

  • In a highly regulated business environment, version control, logging, and approval processes are essential, especially when it comes to verification during audits. As manual processes are time-consuming and error-prone, automating them is important. This includes not only classical CI/CD pipelines for software development but also data-intensive applications such as reporting.
  • Power BI's built-in services did not provide sufficient functionality for our use case. Azure DevOps and PowerShell provide a rich toolset to implement automated deployment processes. Version control for code and files was implemented with Azure Repos, while the automated pipelines including the approval processes were implemented with PowerShell and Azure Pipelines.

When looking at individual requirements, other options may arise that come with slimmer processes. Version control can be provided with SharePoint / OneDrive synchronization or a standalone repository. The Power BI Premium feature “deployment pipelines” makes it possible to automate the deployment process across multiple workspaces, but it comes with additional costs for the Premium capacity, which we did not have at our disposal for this customer. These options provide a suitable starting point for getting used to development processes in your reporting, from which you can build up further. We recommend having a close look at your requirements right from the beginning and thinking long-term as Power BI reporting becomes widely adopted within your organization. Furthermore, we recommend aligning early with the IT stakeholders and making sure that the expertise required to support and maintain the dataset and the automated processes is available for such a comprehensive architecture design.

If you found this article interesting, do not forget to check out our other articles on Smart Data & Solutions and to subscribe to our RSS feed for updates on upcoming articles. For any questions, do not hesitate to get in touch with our experts in the field.

Authors

Ramón Roales-Welsch

Business Lead Data & AI

Bastian Kretschmer

Senior Consultant