
BP Digital Performance Manager

Improving efficiency, driving collaboration and enabling proactive decision-making to help deliver on-time, cost-efficient and safer projects

[Image: DPM home screen]

Overview

BP have taken a step towards process automation for their large-scale downstream projects by introducing the Digital Performance Manager (DPM) platform. It automates manual processes and provides proactive insights and governance for project planning and analysis tasks.

 

Uptake of the platform had not been as successful as hoped, and I joined the project two years after release, tasked with assessing the current platform and improving the overall adoption and frequency of use of the tool within BP projects.

Role

- Workshop Facilitation 

- User Research

- Digital Strategy

- UX / UI Design 

- User Testing

Duration 

3 months (Ongoing)

The challenge

This platform handles critical data that each discipline area on a project needs to record and analyse its progress. BP have been trying to introduce the product to all projects, however employees are set in their ways of working and do not see the value of using the tool for their reporting needs.

• Currently only 30% of the target BP community is actively using the platform

• There is a lack of consistency in how data and content are displayed, with no clear design patterns applied across the report areas

• The platform is updated frequently, which causes users to become frustrated and cautious about taking up DPM until they feel it is ready

• A lack of understanding and consideration of the full range of user groups has led to features that satisfy only a subsection of users

Current Platform

The platform is built within PowerBI and is made up of 14 report areas. It was clear each report had been built in a silo by the development team, with no design rules followed for visualisations, use of colour or report structure. Each project/SME the team had worked with requested different features, and these requests had been taken as a false representation of the full user community.

UX Assessment 

I investigated the history of the platform and spoke to the team in detail to find out where they thought we could improve and why certain decisions had been made. The business had set the DPM product team a strict KPI of increasing adoption by 25% in the next six months, an ambitious target. I put in place a three-stage plan to meet the short- and longer-term objectives from both a strategic and an experiential standpoint.

Stage 1

- Create in-depth representative personas

- User satisfaction survey 

- Project lifecycle mapping

- Information architecture analysis

 

Stage 2 

- Heuristic evaluation of the platform

- Design guide/library

- Redevelop cost reports area (Research & Testing Framework introduced)

Stage 3

- Product onboarding feature

- Digital training platform

- Mobile development

The Users

There was a lack of documentation on the users of DPM, so one of the first tasks was to understand the user groups involved in BP downstream projects. I analysed their main goals and frustrations, along with the other tools and processes they were using. The goal was to replace these with a single automated solution, but it was clear the solution needed to be accurate, real-time and reliable for users to make the change.

User Satisfaction Survey

A survey was conducted to understand the broader community's current opinion of DPM and to record benchmark metrics that could later be compared against to track our progress through development. It also allowed the community to explain why DPM was not currently meeting their expectations, an important view to compare against the product owners' assumptions.

Responses: 112

Disciplines: 10

Net Promoter Score: 20

Satisfaction Rating

- Useful: 3.5

- Intuitive: 3.1

- Efficiency: 2.9
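The Net Promoter Score above comes from the standard 0–10 "how likely are you to recommend" question: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch of that calculation; the sample ratings below are illustrative, not actual survey data:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'how likely to recommend' ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    towards the total but neither group. NPS is reported as a
    whole number between -100 and 100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical sample: 40% promoters, 20% detractors -> NPS of 20
sample = [10, 9, 9, 10, 8, 7, 8, 8, 6, 5]
print(net_promoter_score(sample))  # 20
```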

Thematic Analysis 

Users were asked, "What would you like to see within DPM moving forward?" This allowed users to state what they felt was missing, and five clear themes emerged from the community. The most consistent requests were to improve consistency between reports, to provide clear training and communications to the team, and to improve report load times.

- Training & Communication

- Customisation

- Consistency

- Performance

- Proactive Insights

Project Lifecycle Mapping

To understand the process holistically, we mapped the current lifecycle of a BP project to capture the end-to-end touchpoints users experience. These projects can last from three to six years, and it was clear there was a lack of adaptable processes and platforms to cater for the change across project stages.

Heuristic Evaluation

I conducted a heuristic evaluation of the platform to find usability improvements across the ten heuristic principles. Consistency and standards were clearly the main area of focus, and there was the opportunity to implement quick wins around navigation patterns, colour language and core report templates, with further design patterns and rules amalgamated into the design guide.

Information Architecture Audit

The current platform architecture was complex: 14 report areas, each with multiple sub-reports, totalling 92 main pages. I implemented a clear, consistent nomenclature for each section and reformatted the main navigation items to make them more intuitive. I also moved product updates out of the main IA to a designated training platform to reduce further repetition.

Design & Development Guide

There was a lack of design documentation and guidance for the product; because it was built within Microsoft PowerBI, previous designers had simply worked alongside developers within the tool. It was clear that a set of underlying rules and design standards needed to be collated for each development team to follow moving forward, and applied retrospectively to each report to ensure the guide is used throughout the product.

Research & Testing Framework

I introduced a UCD framework for the product team's future feature development. It combined valid and representative user research through interviews and surveys with task-based usability testing, measuring satisfaction, efficiency and effectiveness from the current experience to the future experience so that our improvements could be measured quantitatively.

• A user survey was shared to gain a broad qualitative and quantitative foundation of insight from the community

• Exploratory user interviews were conducted with three representative users from each user segment (15 in total) to gain an in-depth understanding

• Satisfaction ratings were taken on current features to set a benchmark, and the visual appeal of the features was assessed using the Microsoft Desirability Toolkit
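The benchmark-to-redesign comparison at the heart of this framework reduces to a simple percentage change over mean ratings (or task times). A minimal sketch, with hypothetical participant ratings rather than real study data:

```python
from statistics import mean

def improvement(before, after):
    """Percentage change in mean score from benchmark to redesign.

    'before' and 'after' are lists of per-participant measures for
    the same task, e.g. 1-5 satisfaction ratings. A positive result
    means the redesign scored higher on average.
    """
    b, a = mean(before), mean(after)
    return round(100 * (a - b) / b, 1)

# Hypothetical benchmark vs post-redesign satisfaction ratings
baseline = [3, 4, 3, 2, 4, 3]
redesign = [4, 4, 5, 3, 4, 4]
print(improvement(baseline, redesign))  # 26.3
```

For time-on-task, the same function applies, but a negative result (faster completion) is the improvement.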

Onboarding & Training 

There was a lack of guidance for first-time users, so we integrated a short onboarding flow to help users understand the basics of using a PowerBI tool and core DPM functionality such as filtering, highlighting, exporting data and bookmarks. This in turn increased the session length of initial visits, as users were no longer overwhelmed by complex product nuances.

 

Training had been a time-consuming task: team training calls, large presentation packs, and bi-weekly update videos recorded and posted to the community. The key was to change the output format, storage and frequency by introducing a designated platform to hold micro, video-based training content and further updates in a categorised format.


Design Implementation

Once the recommendations had been implemented in each report area, we ran a second product launch to notify the community of the updates. There was a significant uptake in usage of DPM over the following period: 12% growth in the first month (200+ users). This was followed by a second user satisfaction survey, in which users responded with improved ratings across every area measured.

Main takeaways

DPM is a challenging and complex product that can provide extremely high value to BP employees. The challenge was to convince the community that the product would keep improving and that we had been listening to their valid feedback.

Educating the product team and developers on design best practice through the design guide has helped improve the consistency and quality of outputs.

It is important to continue following the user-centred design approach and not let features be developed in silos without a holistic view of user groups and design patterns.

UX Metrics (Over a 3 month period)

- Average net promoter score increased by 60% to 35 

- Active users increased by 12% in the first month after the recommendations were implemented

- Average session length increased by 9%
