There is a lot of 'talk' about Big Data in the Utilities industry today, but not many organizations have actually started doing something about it. There are pockets of innovation driven by a few brave souls across the globe, while the majority sit on the sidelines, either watching to see whether these innovators fail or succeed, or waiting for something to 'break' within their own organizations. Most Utilities conferences and events today invariably include discussions on Big Data and Advanced Analytics, and the Utilities that participate show plenty of enthusiasm and interest. Yet more and more Utilities seem confused rather than convinced about the paradigm shift required to deal with this unprecedented challenge facing the industry.

Most organizations are still grappling with basic data management, reporting and analytical capabilities, and many critical business processes such as forecasting still run on Excel spreadsheets with macros coded years ago by someone who has since retired; these tools have become so sacrosanct that nobody dares to tinker with them. While some of these processes have remained largely unchanged, the world the Utilities once knew has transformed dramatically. With the introduction of Smart Meters and Grid Automation, Utilities now have access to new information and large volumes of data, which pose both a challenge and an opportunity. Most new-age industries (such as internet-based businesses) have to deal with rapidly changing data volume, velocity and variety; Utilities face the added challenge of 'Dark Data', data that has existed for many years but is forgotten, hard to access or difficult to analyse. Hence the Big Data challenge for the Utilities industry is different from that of many other sectors.
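
To make the forecasting example concrete, here is a minimal sketch of what moving such a spreadsheet macro onto a scripted, repeatable footing could look like. The file name, column names and the simple moving-average method are illustrative assumptions, not a prescription for any particular Utility.

```python
import pandas as pd

# Illustrative assumptions: half-hourly smart meter readings in a CSV with
# 'timestamp' and 'kwh' columns. A real Utility would substitute its own
# meter data store and its own forecasting method.
readings = pd.read_csv("smart_meter_readings.csv", parse_dates=["timestamp"])

# Aggregate interval reads to daily consumption.
daily = (
    readings.set_index("timestamp")["kwh"]
            .resample("D")
            .sum()
)

# Naive baseline: forecast the next day's load as the trailing 7-day average.
# This mirrors what many spreadsheet macros do, but in a form that is
# version-controlled, testable and easy to replace with a better model.
forecast_next_day = daily.rolling(window=7).mean().iloc[-1]
print(f"7-day moving average forecast: {forecast_next_day:.1f} kWh")
```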

Utilities need to wake up to this reality and start doing something about it, so that they can turn this challenge into a value delivery mechanism by milking the 'Big Data Cash Cow'. This is possible by creating a solid Big Data Analytics foundation for realising advanced analytics use cases and building a data-driven business culture within the organization. To achieve this effectively, there are certain critical aspects that Utilities need to consider, which I recommend as the '3P Formula': People, Process and Platform.

Fig-1: The Big Data Cash Cow

The first 'P' in this formula is People: it needs to be recognised that the resources needed to address the Big Data challenge are very different from those needed for typical Business Intelligence problems. Many traditional IT skills, such as BI architects, business analysts, UI developers and DBAs, are still required and may already exist within the organization, but a few unique and rare skills, such as Data Scientists, are critical to delivering any tangible Big Data solution. Beyond this, it is essential to involve the business and ensure that subject matter experts who know the business processes are part of the team. Any organization embarking on the journey of creating a Big Data Analytics foundation has to ensure that all the required skills and resources are available, either internally or through external partners. Another key element of the People aspect is identifying and involving the key stakeholders within the organization who will fund, own and drive the initiative forward.

The next aspect is Process. Many organizations fail to implement large, strategic initiatives successfully because proper processes are absent or inadequate. It is therefore important to identify upfront the process elements this foundation requires, so that the initiative and the investment deliver on the envisaged objectives and goals. These processes cover areas such as identifying the right use cases using techniques like Design Thinking, determining the business value of a use case, prioritising initiatives through a scoring mechanism, and delivering the results. They need to be documented, agreed, owned and communicated across the organization so that everyone involved in the initiative is aware and informed.
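
To illustrate the scoring mechanism mentioned above, the sketch below ranks candidate use cases with a simple weighted score. The criteria, weights and use cases are hypothetical placeholders that each organization would define and agree through its own process.

```python
# Hypothetical criteria and weights for prioritising Big Data use cases.
WEIGHTS = {"business_value": 0.4, "data_readiness": 0.3, "effort": 0.3}

# Scores on a 1-5 scale; the use cases and numbers are purely illustrative.
use_cases = {
    "Load forecasting":   {"business_value": 5, "data_readiness": 4, "effort": 3},
    "Revenue protection": {"business_value": 4, "data_readiness": 3, "effort": 2},
    "Outage prediction":  {"business_value": 5, "data_readiness": 2, "effort": 4},
}

def priority_score(scores: dict) -> float:
    """Weighted score; effort counts inversely (6 - effort on a 1-5 scale)."""
    return (WEIGHTS["business_value"] * scores["business_value"]
            + WEIGHTS["data_readiness"] * scores["data_readiness"]
            + WEIGHTS["effort"] * (6 - scores["effort"]))

# Print the candidate use cases from highest to lowest priority.
ranked = sorted(use_cases.items(), key=lambda kv: priority_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {priority_score(scores):.2f}")
```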

The final and key element of this formula is the Platform. This covers the choice of technology that can deliver the Big Data capabilities required to realise the various business use cases. The choice is often driven by an organization's enterprise-wide IT strategy, but a pragmatic decision is needed, since many technology platforms today (such as SAP HANA) can integrate seamlessly with a myriad of Information Technology and Operational Technology systems. It is also necessary to acquire the right analytics tools (such as SAP InfiniteInsight) and products (such as SAP BI) to deliver an end-to-end advanced analytics solution. Some of these tools require specialist knowledge (such as building scientific models and developing algorithms) and may call for additional training, which needs to be factored in.
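
As a small illustration of the Platform element, the sketch below pulls smart meter data out of SAP HANA into Python for further analysis. It assumes the SAP HANA Python client (hdbcli); the host, credentials and table name are placeholders, not a reference architecture.

```python
from hdbcli import dbapi  # SAP HANA Python client

# Placeholder connection details; replace with your own HANA system's values.
conn = dbapi.connect(
    address="hana.example.com",
    port=30015,
    user="ANALYTICS_USER",
    password="********",
)

# Hypothetical table of smart meter interval reads: total consumption per
# meter over the last 30 days, highest consumers first.
cursor = conn.cursor()
cursor.execute(
    """
    SELECT METER_ID, SUM(KWH) AS TOTAL_KWH
    FROM SMART_METER_READS
    WHERE READ_DATE >= ADD_DAYS(CURRENT_DATE, -30)
    GROUP BY METER_ID
    ORDER BY TOTAL_KWH DESC
    """
)
for meter_id, total_kwh in cursor.fetchall():
    print(meter_id, total_kwh)

cursor.close()
conn.close()
```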

Once this Big Data Analytics foundation is in place and working like a well-oiled machine, you will have created a 'Cash Cow' whose fodder is your data. Make sure your Big Data Cash Cow is well fed so you can keep extracting value from it. Happy milking!