Common Pitfalls Seen in the Industry
Surveillance dashboard projects in the E&P industry tend to suffer from the typical project management issues of cost, quality and schedule slip-ups. The industry is replete with examples of seemingly simple operational dashboard implementations that turned into costly and wasteful undertakings.
We present below an analysis of some of the common pitfalls impacting the implementation of surveillance dashboards. This focus does not mean that the post-implementation sustainability aspects of such projects are unimportant; on the contrary, in our experience many post-implementation sustainability issues are caused by poor implementation.
Pitfall 1: Centrally-led “push” approach failing to create asset buy-in
Project Impact Areas: Organization buy-in
This pitfall mainly applies to centrally-led Digital Oil Field (DOF) programs, which typically include surveillance dashboard projects as one of their components. Such programs are driven by a central corporate DOF team entrusted with defining DOF strategy and standards and implementing them across business assets.
Different assets have different facilities configurations, operational philosophies and field organization structures. As a result, their operational drivers, KPIs and use cases for a surveillance dashboard tend to differ. A one-size-fits-all implementation approach from the central team is therefore met with resistance or skepticism from the assets.
Assets are also often unclear about how a new solution introduced by the central team would fit into their environment and processes, and view it with a “here comes one more” mindset.
Failure to recognize and reconcile these fundamental differences between corporate and asset drivers reduces the uptake of centrally-led surveillance dashboard solutions.
Pitfall 2: Lack of right level of business participation
Project Impact Areas: Organization buy-in, scope, quality and schedule
Surveillance dashboard projects require continuous and deep involvement of business users throughout the project cycle. Unfortunately, many surveillance dashboard projects are treated as IT projects, managed by an IT project manager and staff with a limited understanding of business issues. Another reason cited for limited business involvement is a shortage of business resources. Either way, this has the following ramifications:
- IT staff usually lack awareness of field-specific variations in surveillance practices and the associated organization structures. Risks or implications originating from these variations are not factored into initial project planning, leading to numerous scope changes later on.
- IT staff do not always have the influence or authority needed to manage business decisions regarding scope and schedule. As a result, project teams often give in to business demands for frequent changes, making excessive modifications to scope and system design.
- Project teams try to compensate for the limited availability of business resources by hiring external SMEs. However, these SMEs also lack an understanding of the asset's day-to-day operational practices and internal organizational dynamics.
Pitfall 3: Adopting a big-bang approach to deliver large scope of work
Project Impact Areas: Scope, schedule, quality and budget
Surveillance dashboard projects often cater to multiple business processes. One common issue we have seen is project teams trying to deliver a very large scope of work using a traditional waterfall release strategy. In our view such an approach is fraught with risk: scope creep and bug-laden deliveries are common. Even when project scope is managed aggressively throughout the lifecycle, the learning that comes with incremental delivery is lost, and many hidden issues are uncovered only at the end of the project. For example, data quality problems are common in production surveillance dashboards, and we have seen many projects where the team discovered bad-data issues that were not known earlier. Likewise, in many instances we have seen performance issues caused by insufficient network bandwidth or badly tuned PI systems. Fixing these issues is a time-consuming process and leads to further project delays.
Pitfall 4: Failing to address data and IT infrastructure quality issues
Project Impact Areas: Organization buy-in and quality
Surveillance dashboard systems depend heavily on the underlying IT and communication infrastructure. Given the large and geographically diverse user base, high usage frequency and large volumes of real-time and non-real-time data transmitted over the network, it is important to pay attention to performance. End-users' performance expectations are typically very high: they expect a graph or chart to appear within 2-4 seconds of a mouse click.
Data quality is another important aspect, as the key objective of a dashboard is to enable data-driven decision making. It is essential that end-users trust the data they see on their screens.
Although poor performance and bad data quality are the most commonly cited reasons for the failure of these dashboards, neither factor is usually given sufficient attention. Such issues undermine the promised value of surveillance dashboards; if not addressed in time, end users lose trust in the dashboards and go back to their individual tools and spreadsheets.
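Part of the data-quality screening described above can be automated at the point of ingestion. The following is a minimal sketch in Python; the function name, thresholds and issue labels are our own illustrative assumptions, not part of any specific historian or dashboard product:

```python
from datetime import datetime, timedelta

def check_tag_quality(readings, low, high, max_gap_minutes=60, freeze_limit=3):
    """Screen a time-ordered list of (timestamp, value) tag readings for
    three data-quality problems commonly seen in surveillance dashboards:
    out-of-range values, transmission gaps, and frozen (stale) signals.
    Returns a list of (timestamp, issue) tuples."""
    issues = []
    prev_ts, prev_val, repeat_count = None, None, 0
    for ts, val in readings:
        # Value outside the tag's plausible engineering range.
        if not (low <= val <= high):
            issues.append((ts, "out_of_range"))
        if prev_ts is not None:
            # Gap in transmission longer than the allowed interval.
            if ts - prev_ts > timedelta(minutes=max_gap_minutes):
                issues.append((ts, "data_gap"))
            # Identical consecutive values may indicate a frozen sensor.
            if val == prev_val:
                repeat_count += 1
                if repeat_count == freeze_limit:
                    issues.append((ts, "stale_value"))
            else:
                repeat_count = 0
        prev_ts, prev_val = ts, val
    return issues
```

Running checks like these during implementation, rather than after go-live, surfaces bad-data issues while the project team is still in place to fix them.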
Pitfall 5: Underestimation of skills required for project execution
Project Impact Areas: Quality, schedule and cost
The skills required to deliver surveillance dashboard projects are frequently underestimated.
Surveillance dashboard projects involve iterative cycles to arrive at final specification. Each iteration involves associated end-user interactions, documentation, business reviews and design modifications.
Reducing the number of iterations and shortening the timescale of each requires superior business analysis skills. Unfortunately, surveillance dashboard projects are often staffed with Business Analysts (BAs) who lack the skills to perform in the dynamic and complex business environments associated with dashboard projects.
The role of technical architects is also very important: wrong technical choices frequently lead to performance issues and, in turn, end-user dissatisfaction.
Example: In one case, a project team designed a technical solution sized for the few hundred wells the company had at the time of the project; the same solution could not scale effectively when the company drilled a few hundred more wells, and end-users faced severe performance issues. In another case, the project team selected a tool for evaluating real-time data streams against a set of algorithms. The tool worked fine with a few calculations, but when more calculations were added to the scope in the later stages of the project, the tool suffered significant performance problems.