Power BI White Paper: Adoption and Governance
You want to adopt Power BI?
Article written by Paul Brady, Principal Consultant for BI at Trustmarque.
Executive summary
You’ve seen the demos, the videos on YouTube, you’ve seen your competitors or related organisations produce wonderful, insightful, immersive content. Now it’s your turn!
Download Power BI Desktop, load up some data, publish your content and you are done.
Easy. You are now “Doing Power BI”.
If only it was so simple.
Microsoft has built an incredible tool that is expanding at breakneck speed with monthly iterations including new visuals, new features, new connections and more functionality.
As a senior leader in a large organisation what should be on your checklist of things to consider when implementing Power BI or any technology that has the power to drive major organisational changes?
This whitepaper guides you through some of the key elements to making your adoption of Power BI a success.
Adoption roadmap
Creating your first piece of content or publishing your first report is only the start, and it's vital to remember that you won't have everything you need in one place on day one. Reaching that first milestone does not mean you are done. Your messaging to the organisation at this stage should be "we are just at the beginning" and "there is a long road ahead". Your audience will have little understanding of what needs to be put in place to adopt, control and govern Power BI. In most cases it is a step-by-step process, and very few organisations have the patience or budget to put everything required in place on day one. It is therefore important to map out the stages and requirements of adopting Power BI. We refer to three stages:
- Pragmatism and speed
- A solid foundation
- Full integration
Each stage has an amount of resource and control appropriate to its place on the roadmap. The pragmatism and speed stage often involves creating and publishing a specific piece of content to a specific audience for a specific purpose. The adoption requirements to do this are limited and can be met very quickly. At the other end of the spectrum is full integration, where everything is audited; data, datasets and content are shared internally and externally, fully controlled and documented; there are processes in place to prioritise updates versus new content; and Power BI governance integrates seamlessly into the wider organisational controls. Setting out your ambition, use case by use case, is the key to creating a successful roadmap. With each use case, check off the following requirements:
Data – source, frequency, size, method of connection
Roles – which roles are needed internally to develop and deploy content, gather and dictate best practice, etc.
Documentation – which processes need to be agreed and written up for the use case to succeed, e.g. integration with the helpdesk, DevOps or the starters/leavers process
Adoption opportunities
There are huge opportunities to transform your organisation by making better use of data, turning decision cycles from quarters to months to weeks to days to hours to real time. The automation inherent in Power BI frees hundreds of hours of manual effort. How will you use it? What will you target the team on next? Take the time to work with the subject matter experts to understand what opportunities arise from having better data sooner: for example, a live bed state on your mobile, delivery schedule information in a meeting, or changing a performance cycle for a service from a monthly retrospective to daily stand-ups. What alerts do your team need to set up? Power BI can proactively alert not only on KPIs above or below target, but also on statistical anomalies in process data or complex combinations of KPIs over time. In a large, complex organisation with a deluge of data and information, it can be set up to metaphorically tap a user on the shoulder and say "you told me this was important and I'm telling you that you need to look at it now".
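The alerting idea above can be sketched in a few lines. This is an illustrative example only, not Power BI's alerting engine: it flags a data point that breaches a target, or one that sits well outside the normal statistical range of the series. The KPI values, target and 3-sigma threshold are invented assumptions.

```python
# Illustrative sketch of KPI alerting: flag target breaches and
# statistical anomalies. Data and thresholds are invented.
from statistics import mean, stdev

def kpi_alerts(values, target, sigma=3.0):
    """Return indices of points above the target or more than
    `sigma` standard deviations from the series mean."""
    mu, sd = mean(values), stdev(values)
    return [i for i, v in enumerate(values)
            if v > target or abs(v - mu) > sigma * sd]

daily_waits = [31, 29, 33, 30, 32, 58, 31]  # e.g. minutes waited per day
print(kpi_alerts(daily_waits, target=45))   # flags the spike on day index 5
```

In Power BI itself this would be a data alert or a measure driving conditional logic; the sketch only shows the shape of the rule being applied.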
How much of your reporting is retrospective? How much time do you put into predicting the future versus reporting the past? Improve your analytics maturity by adopting forecasting techniques alongside historical reporting. No one has a crystal ball, but the use of SPC, linear regression and other forecasting tools can be greatly beneficial. A simple linear regression on a numerical KPI can predict the trend in three, six or nine months' time. This separates the "always red" KPIs that everyone knows about from the currently green KPIs that will become red without intervention, giving subject matter experts insight into things they can still do something about, as opposed to just telling them what has already gone wrong. Think beyond IT/informatics/analytics into why the data is collected in the first place.
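The linear-regression idea can be shown with a minimal worked example. This is a sketch under invented assumptions (the KPI history and six-month horizon are illustrative): fit a straight line to a monthly KPI and project it forward to see whether a currently green KPI is trending towards red.

```python
# Ordinary least-squares fit of a KPI against month index, projected
# forward. The KPI history below is invented for illustration.
def linear_forecast(values, months_ahead):
    """Fit value = intercept + slope * month and project
    `months_ahead` beyond the last observation."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + months_ahead)

kpi_history = [92, 91, 90, 88, 87, 85]  # % compliance, currently "green"
print(round(linear_forecast(kpi_history, 6), 1))  # projected value in 6 months
```

The projection shows a KPI that looks healthy today drifting below a plausible threshold within six months, which is exactly the early-warning conversation the paragraph above describes.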
Technical tools
This is one of the smaller challenges from an adoption point of view. Power BI has a variety of technical tools that assist with adoption, each fully documented with very good resources from Microsoft and on YouTube.
Tenant settings – over 110 settings that allow or restrict access to Power BI features and functionality. This is often the starting point for your documentation: recording each setting and the reason it was chosen. It is something you can return to, giving you an audit trail of who changed what in the tenant, why and when.
Access control – every piece of content (report, dashboard, workspace, app, dataset, dataflow, datamart) can be individually controlled, not just in terms of who can access it but also the level of access (Admin, Member, Viewer etc.), each with a different level of authority.
Row-level/object-level security – restrict what end users can see when they log in, so the same report shows differently based on who is looking at it: e.g. one person sees North district, another is assigned only South district, and the manager sees both North and South.
Data loss prevention policies – DLP and sensitivity labels are both part of the overall Office 365 suite of controls and apply equally to Power BI.
Power BI/O365 audit – almost every activity in Power BI is recorded in the audit logs, and this data can be interrogated for one-off investigations or routinely monitored for maintenance needs.
PowerShell – use PowerShell commands to move content, grant or remove permissions and audit Power BI data.
Power BI gateway – understanding how it works and how it can be used to control data access will help.
There are many other controls as the use cases become more specialised and interaction with other data sources takes place, but mastering this list is a great start to understanding how to keep your data secure in Power BI.
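To make the row-level security item above concrete, here is an illustrative sketch of the effect it achieves. This is not Power BI's actual engine (in Power BI, RLS is defined with DAX filter rules on roles); the user-to-district mapping and data are invented. The point is that one report definition returns different rows depending on who is viewing.

```python
# Illustrative simulation of the row-level security concept:
# the same "report" query is filtered per viewer. All names invented.
ROWS = [
    {"district": "North", "referrals": 120},
    {"district": "South", "referrals": 95},
]

# Role assignments: which district(s) each user may see.
RLS_RULES = {
    "alice@example.org":   {"North"},
    "bob@example.org":     {"South"},
    "manager@example.org": {"North", "South"},
}

def report_for(user):
    """Apply the row filter for `user` before the report sees the data."""
    allowed = RLS_RULES.get(user, set())
    return [row for row in ROWS if row["district"] in allowed]

print(len(report_for("alice@example.org")))    # sees 1 row
print(len(report_for("manager@example.org")))  # sees 2 rows
```

A user with no role assignment sees nothing, which is also the safe default you want from the real feature.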
Democratisation of data – free for all?
“Hey I want to share a dataset with my partner organisation”. Technically, this is straightforward but what does it mean for governance? Let’s look at a starter checklist of what might be asked before undertaking such a task.
- Do we have a confidentiality agreement, a data sharing agreement and a specific DPIA covering this request?
- What is the data and does it contain any personally identifiable information?
- Who internally is responsible for the data, and whose permission do we need to ask?
- If it is identifiable data, do we need permission from the data subject?
- What is the process for gathering the permission, where do we record that decision?
- Who will monitor the access and ensure it is up to date?
- How long will it be for and who will turn it off when complete?
- Who will access the data in the partner organisation and for what purposes?
This checklist can grow quite long, but you get the idea. It's the thinking around Power BI, the checks and balances, that ensures that when a permission is granted it is recorded and then appropriately revoked, so there is no unbridled proliferation of data. It is a controlled roll-out.
There are many helpful tools built into Power BI to assist the sharing of content and data; these must be understood and controlled to have a solid governance process. Distinguish between the owner/subject matter expert of the data (e.g. the Director of HR) and the person who actually presses the buttons that grant access (e.g. the helpdesk). Both parts are required to create a fully functional process to grant and revoke access to content.
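The grant-and-revoke record described above can be sketched as a simple access register. This is not a Power BI feature but an illustration of the governance data you would want to capture; every field name and value is an invented assumption. Each grant records who approved it (the data owner) and who executed it (e.g. the helpdesk), with an expiry so access is reviewed rather than left open forever.

```python
# Illustrative access register for data-sharing governance.
# All field names and example values are invented.
from datetime import date

access_register = []

def grant(content, grantee, approved_by, actioned_by, expires):
    """Record a sharing decision: owner approval and executor."""
    access_register.append({
        "content": content, "grantee": grantee,
        "approved_by": approved_by, "actioned_by": actioned_by,
        "expires": expires, "revoked": False,
    })

def revoke_expired(today):
    """Mark entries past their expiry as revoked; return the count."""
    count = 0
    for entry in access_register:
        if not entry["revoked"] and entry["expires"] < today:
            entry["revoked"] = True
            count += 1
    return count

grant("HR headcount dataset", "partner-org analyst",
      "Director of HR", "helpdesk", date(2024, 3, 31))
print(revoke_expired(date(2024, 6, 1)))  # one grant is now past expiry
```

In practice this record might live in a helpdesk system or SharePoint list; what matters is that the approval, the action and the expiry are all captured somewhere auditable.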
Single version of the truth – lineages that matter
“Single version of the truth” is a phrase that has become almost a cliché. However, with strict control of the data from source through data warehousing, it is possible to create a genuine single version of the truth that supports multiple business outputs. This gives users confidence in the decisions made using that data and eliminates multiple versions of the same numbers being surfaced from slightly different sources, creating conflicting information. With a fully audited data lineage from the data source, Power BI has fantastic tools to tell you the downstream impact of a change to any dataset and what it might affect further down the data chain. Again, ownership matters: analysts should be closely aligned with the subject matter experts in the organisation's pillars (HR, Finance etc.).
This creates trust and a deep understanding, where the analysts understand the processes and nuances of the organisation. It means all the appropriate inclusions, exclusions and exceptions in the data are picked up and documented by the analysts in conjunction with the data warehousing staff. Keeping your analysts close to the organisation's subject matter experts is key. It is only possible to create a single version of the truth by documenting it in a standard and disciplined manner. That way, if a member of staff changes, knowledge of the data lineage doesn't leave with them. If the organisation undertakes a structural change (new divisions, new roles, new locations etc.), an impact assessment on the existing data estate becomes straightforward (still tedious, but less so). Having methodologies and standards for documenting this data lineage is vital for audit scrutiny and business continuity.
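The downstream-impact question that lineage answers can be sketched as a small graph traversal. The dependency graph below is invented for illustration (Power BI's lineage view computes this for you across dataflows, datasets, reports and dashboards); the sketch only shows the shape of the question: "if this changes, what is affected?"

```python
# Illustrative downstream-impact walk over an invented lineage graph.
# Maps each artefact to the artefacts that consume it directly.
LINEAGE = {
    "Finance dataflow": ["Finance dataset"],
    "Finance dataset":  ["Board report", "Budget dashboard"],
    "Board report":     [],
    "Budget dashboard": [],
}

def downstream(artefact):
    """Depth-first walk returning every artefact affected by a change."""
    affected, stack = set(), [artefact]
    while stack:
        for child in LINEAGE.get(stack.pop(), []):
            if child not in affected:
                affected.add(child)
                stack.append(child)
    return sorted(affected)

print(downstream("Finance dataflow"))
# ['Board report', 'Budget dashboard', 'Finance dataset']
```

Even at this toy scale, the value is obvious: a proposed change to the dataflow immediately surfaces the board report and dashboard that would need re-testing.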
Centre of Excellence and integration into existing processes
What is a “Centre of Excellence”? It is a team, most likely made up of existing staff, who develop into defined roles to support specific elements of Power BI. For example, someone may specialise in the data engineering side of Power BI and become a Power Query expert, while someone else may invest more time in becoming a UI/design expert with the visualisations. In time, most members of the team will become subject matter experts for the business as a whole; until they get there, it is helpful and a little less daunting if the tasks in Power BI are shared across the team. With everyone learning Power BI at a general level, not everyone has to become a subject matter expert in every single aspect of Power BI straight away; specialising in an area helps spread the load. Within the team there are also responsibilities (assigned as roles) that need to be undertaken, for example the “Librarian”. This is someone who keeps a record of every single piece of content developed in the organisation. They also hold best practice scripts, such as date tables, that can be universally applied to create high-level standardisation across the organisation. On day one a librarian has little value, but as the estate grows they can save a huge amount of time by directing people to existing dataflows and datasets, contributing to the single version of the truth. At Trustmarque we define about a dozen key roles (not full-time jobs) that the Centre of Excellence needs to adopt, all of which will grow as the estate grows.
The Centre of Excellence doesn't stand on its own; it is the technical arm of the governance structure. When Power BI is released into the organisation, it is not uncommon for multiple senior staff to put pressure on the analytics team to develop whole suites of content for their pillars, and it is very hard to say no. The answer is in two parts. First, the adoption of Power BI needs a senior, director-level sponsor to champion and provide air cover for the team. This alone is not enough, so we would normally also suggest an operations group and a strategy group. The operations group meets weekly and contains subject matter experts from IT, governance, finance etc. This group deals with the roll-out of the project as well as managing requests and content; they also do the legwork of creating and ratifying all processes related to Power BI. It reports to the Power BI steering group, a director-led group that sets the priorities for the ops group and ensures the implementation provides maximum value to the organisation. This group has the final word on priority and investment. The Centre of Excellence will scope any request from a technical point of view; most decisions will be made by the ops group, but for competing priorities it is the strategy group that makes the ultimate decision.
The final keystone in a successful adoption is integration into existing processes. If you have a helpdesk, a PMO, a digital PMO, a transformation team, an organisational restructure, or a major IT adoption such as an EPR/CRM, then make sure Power BI adoption is integrated into those pre-existing processes. The integration might be as simple as sharing progress updates, or it could be joint meetings. Those pre-existing processes have routes to capital, routes for governance and access to existing resources in the organisation. The last thing you want is to create yet another competing claim on a shared resource such as budget or subject matter expertise. All the pre-existing processes and projects should be aligned to organisational objectives and strategies, and by aligning the adoption of Power BI with these groups you become an enabling rather than a competing resource. Adopting Power BI within IT in the hope of "pushing it out" is an extremely difficult approach with a very long adoption process.
Make integration with existing systems and processes a key part of the roadmap, but don't forget to put the operations and strategy groups in place. They need not be standalone; the responsibilities of those groups could be adopted by pre-existing groups, but the lineage of those responsibilities from board-level strategy down should be very clear to all.
Do I need to replace my data warehouse?
If you are even asking that question, then you already know the probable answer is yes. The more interesting questions are when, and with what? We have seen organisations spend years and a lot of money building the "perfect" data warehouse with the intention of using Power BI as just the visualisation layer, before even opening and using Power BI. Every CIO who started with "all transformations must take place in the data warehouse" has changed their mind once they found out what Power BI can do.
For those with a poor data warehouse, it is neither ideal nor best practice, but it is pragmatic and fast to build your Power BI estate on what you have, with Power BI taking up the slack and moving upstream to perform some of the validation, data quality and integration work you would expect your data warehouse to do in a best practice environment. It really does depend on your use case, but the best advice is to understand what Power BI can do before committing to a major data warehouse rebuild. It will most likely change what you intend to do, or at the very least the timeframe and order in which you do it. Power BI is built to work seamlessly with the whole suite of Azure data products, and understanding those capabilities in the context of Power BI, rather than simply comparing them to your current infrastructure, is really important for such major expenditure.
What about Fabric?
What is Fabric? In the words of Bing Chat: “Microsoft Fabric is a SaaS analytics platform that integrates various services and experiences from Power BI, Azure Data Factory, and Azure Synapse. It enables enterprises to perform data movement, data engineering, data science, real-time analytics, and business intelligence in a unified and easy-to-use environment. Microsoft Fabric simplifies analytics needs by providing a shared data lake, centralised administration and governance, and seamless integration across all components.” From the point of view of someone adopting Power BI in September 2023, it is something to watch closely. Think of it as a wrapper around Power BI that brings some of the most powerful data science and engineering tools into the hands of analysts. It is not a replacement for a full enterprise data warehouse, but it does put those tools within the analysts' reach.
Should you wait? Will it change your planned Power BI adoption? In short, no.
Everything you need to do for Power BI is also something you need to do for the adoption of Fabric. Think of it as a bigger toy box of tools for the analysts. It has incredible power and opportunity, but it also has a learning curve with different prerequisites. The main stumbling block at this time is that it is in preview, so no production workloads should run on it, and there is no definitive date for general availability yet. If you are just starting out, tackle Power BI first: there are many fantastic features in Fabric, but Power BI can be overwhelming enough for a newcomer without adding additional layers that may or may not be needed in the early stages of adoption. That advice, of course, is subject to change as the product develops. In all likelihood, the pragmatism and speed phase of a Fabric adoption will contain specific tools and tasks that make the Power BI adoption easier.
In Summary
Start today, start small and plan big. As you adopt Power BI and its transformative opportunities, there should be a corresponding governance workload that grows with the size and complexity of your estate. By understanding what needs to be added to the roadmap, and when, the work can be resourced as you progress along your analytics maturity and Power BI adoption roadmap.