Rory Shaffer

The Rallying Cry of “Big Data” with the USE CASE in Mind

Teams are often at a loss about the value that will come from their Asset Big Data project.  This doesn’t have to be the case if you know where you want to go in the first place.

How often do you walk into the airport and ask for a ticket to anywhere?  My point is that every journey begins with a destination in mind, an idea of what will be there at the end.

In the world of analytics and big data projects, that destination is often overlooked in the frenzied effort to wrestle with the technology that surrounds today’s hype cycles.

So how do you approach the value and seize the opportunities that are there for the taking?  The early Greeks knew that one must “Know Thyself” in order to gain higher levels of knowledge.  In analytics this translates to having the use case in mind from the start, which gives you a fighting chance of ending up with maximum value in the end.

I have found that the following graphic provides a systematic approach to unearthing the use case and, inherently, the value that keeps projects alive and moves end users’ insight and knowledge onto a higher plane.

Let’s break this graphic down step by step:

Consider Purpose and Scope –

What is the purpose of the use case – is it to be a deep analysis of data that is largely historical?  Perhaps a predictive outcome using a myriad of data sources to bring the perspective up out of the deep rat hole that engineering solutions often provide?  Then consider who the users will be – engineers, analysts, managers.  Is one presentation method appropriate, or does the user community require role-based presentations to be acceptable?

What is the scope?  End-to-end (E2E) processes deliver the insight and the action that allow the user to make that final decision knowing all that the data can reveal – however, that may be too much initially.  A subset of the process, or a limited scope, allows for more iterations and learning cycles within a finite time frame.

Prioritize Asset Classes – 

If the breakers are the problem, then by all means go for the breakers.  The choice of asset classes, and later the specific asset types, is often a major decision for a project.  Teams don’t have to go away thinking it’s all or nothing – laundry-list the classes and apply a common-sense test at this stage; the ones that make the cut may not survive the next step.  Furthermore, the point of this exercise is to set the stage for future deployments – everyone can have their turn!

Identify Data Sources –

SPOILER ALERT – Undoubtedly the data you have is working today and you are making decisions with it – so don’t get hung up on the “Gotta Clean Data First” downward spiral.

Consider the source systems that you are already using, but be open to including additional data when it correlates strongly with an element of the insight you are working towards in your use case.  How about unstructured data from text?  The work order history files contain the feedback from the field and describe in detail what was directly observed at the time of the activity.
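As a rough illustration of how that field text can be put to work, here is a minimal sketch that tags work order notes with a handful of failure-mode keywords and rolls the hits up per asset.  The file name, the columns ("asset_id", "completion_notes") and the keyword list are assumptions made for the example, not the layout of any particular system.

# A minimal sketch: pull simple signal out of unstructured work order text.
# The CSV name and column names below are assumptions for illustration only.
import pandas as pd

orders = pd.read_csv("work_order_history.csv")

# Failure-mode keywords an SME might care about; purely illustrative.
keywords = ["corrosion", "overheating", "vibration", "leak", "arcing"]

notes = orders["completion_notes"].fillna("").str.lower()
for kw in keywords:
    orders[kw] = notes.str.contains(kw)

# Count keyword hits per asset to see which field observations keep recurring.
summary = orders.groupby("asset_id")[keywords].sum()
print(summary.sort_values(by=keywords, ascending=False).head(10))

Even a crude pass like this can surface which assets keep generating the same field observations – often enough to justify a deeper text-mining effort later.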

Whatever is determined to be the ideal data to work with, the next step is to consider the connectivity and security that come with its use.  If the data you land on comes from a third-party or public source, scrubbing out uniquely identifiable information is generally required.
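What that scrubbing can look like in practice is sketched below – dropping fields the analytics never need and hashing the ones that must remain usable as join keys.  The file and column names ("customer_name", "service_address", "meter_id") are hypothetical placeholders, not a prescribed schema.

# A minimal sketch of scrubbing uniquely identifiable fields before use.
# File and column names are assumptions for illustration only.
import hashlib

import pandas as pd

def pseudonymize(value, salt="project-salt"):
    # Replace an identifying value with a stable, non-reversible token.
    return hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]

raw = pd.read_csv("third_party_feed.csv")

# Drop fields the analytics never need.
clean = raw.drop(columns=["customer_name", "service_address"])

# Hash, rather than drop, the fields still needed as join keys.
clean["meter_id"] = clean["meter_id"].map(pseudonymize)

clean.to_csv("third_party_feed_scrubbed.csv", index=False)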

Design, Deploy, Refine –

Personas define the users and their needs in the analytics – engineers and managers each have a different take on the use case and its realization.  Are they using the data in tables, maps, graphs?  Will they be using this in conjunction with other systems in an end-to-end (E2E) process?

First-round deployment is often limited to the SMEs so that the results can be sanity-checked against established methods and tribal knowledge – depending on your skill and a measure of luck, you may get to pass go or be sent back to …

Refinement – curve fitting and adjusting the factors in the algorithmic processes to allow more accuracy in the results are key to gaining the confidence of the end user community.
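To make that refinement step concrete, the sketch below re-estimates the factors of a simple degradation curve as observations come in.  The exponential model form and the numbers are placeholder assumptions for illustration, not real asset data or a recommended model.

# A minimal curve-fitting sketch: re-estimate the factors of a simple
# degradation model.  Model form and numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def degradation(age_years, a, b):
    # Assumed exponential wear-out curve: failure likelihood vs. asset age.
    return a * np.exp(b * age_years)

age = np.array([1, 3, 5, 8, 12, 15], dtype=float)           # asset age in years
observed = np.array([0.02, 0.04, 0.07, 0.15, 0.33, 0.55])   # observed failure rates

factors, _ = curve_fit(degradation, age, observed, p0=[0.01, 0.2])
print("refined factors: a=%.4f, b=%.4f" % tuple(factors))

Each refinement cycle of this kind, reviewed with the SMEs, is what builds the end users’ confidence in the numbers.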

Activate, Monitor –

Now that you have taken the time to methodically work through this approach, your end user community of personas is using the analytics and is happy with the results.  The ability to deliver subsequent asset classes or specific types is now established, and funding should follow!

This is also a good time to examine system response as the good news spreads throughout the organization and more users come on board!

 

For more insight on Big Data for asset managers, take a look at my series of blogs describing how SAP can help those of you who are “Data Rich and Insight Poor”: <link>
