
Background

This is more of a reply to Steve Rumsby’s latest blog, entitled Is self-service BI really a good thing?, but I thought it made more sense as a blog post than a really long comment.  So, go and read Steve’s thoughts and then head back over here…

Just to set the scene: my main focus is not the BI world, although I have worked on numerous BW/BI projects over the years.  My angle here is really the vaguer term of “reporting”, which I believe anyone who has spent any time working in IT will relate to.

My thoughts

For me it was quite fortuitous timing that Steve posted his thoughts around self-service BI this week, as I have just finished the initial stage of an internal review of reporting for my employer.  I’d not specifically considered or heard the “self-service BI” moniker before, but I think Steve has hit the nail on the head with where companies should be and some of the challenges they face.  There are a number of obvious limitations, as Steve has already alluded to, around the quality of the “intelligence” generated and the underlying data.  I believe the challenges go further than that, but with the rapid change in technology we are seeing in the SAP world, many of these challenges can at least be mitigated or even removed completely.  But at what cost?

We spend so much time having “big data” rammed down our throats, and I’m sure we all work in organisations that have operational reporting solutions much more complex and unwieldy than anyone ever anticipated was necessary.  Yet we all persist with this approach, building custom reports in SAP, BO data services, mobile reports and infinite data extracts to Excel, where the manipulation process starts afresh and further layers of reporting are added.  I sometimes wonder if big data is a self-fulfilling prophecy, and how long it will be before the IT industry turns its attention to data diets (what we traditional folk call archiving?!)

What can we do?

Ask an operational manager if they need all 42 reports they receive each day and you can guarantee the answer will be a definite yes!


There’s a part of me that always wants to just turn off all reporting solutions and see who jumps up and down and demands they get their reports back – I’m a firm believer that it would be a good few weeks before anyone really noticed… 😉  In the real world, though, this isn’t an option.  There has to be a better way.

Mashups and self-service BI, as Steve has mentioned, are obvious ways of removing some of the layers of reporting complexity and bringing the consumers of the data closer to the actual source.  Platforms such as HANA now offer the technological capability to churn through massive amounts of data in real time – couple this with increasingly powerful and user-friendly reporting solutions and the world is your mollusc!

Accuracy & Efficiency – a marriage made in heaven?

Coming back to Steve’s point and questions around empowering decision makers, I see “good data” as the biggest obstacle to effective and efficient reporting.  We all know the old adage of rubbish in, rubbish out, but this is absolutely paramount when important decisions are based on reports – that is the whole point of business intelligence.  The thing is, you cannot solve this problem with clever reporting, uber-fast databases or fancy visual representations; you need more.

To reference and expand on Steve’s BBC anecdote, it doesn’t matter if all of the cod had RFID tags, were tracked in real time with HANA capturing geo-spatial data, and the report of their numbers was projected in 3D onto the moon for all to see; ultimately the results would still be incorrect.

So, what am I getting at here?  There are two key elements I believe can greatly affect the results of reporting:

  1. Get your data correct.
    Steve has touched on this in his post already.  It’s an obvious, no-brainer point on the surface and one that many of us will highlight on numerous occasions.  Going further, I’m not just talking about making it accurate, such as correctly measuring a cod as an adult.  I’m talking about making it complete, timely and accessible.  The SOA movement has, in my opinion, gone a long way to help how we build data services, and the latest interest in REST-based services in the SAP world adds further capability in this area.  Techies are able to deliver powerful and targeted data query and manipulation services that can be used almost anywhere.  But they must be absolutely correct in all ways.  That makes things more difficult!
  2. Get your processes correct.
    There are two streams to this – the processes of creating and maintaining your data (both the model and the content) and the processes of extracting, manipulating and creating reports.  I’m a big proponent of SAP BPM and believe it has a big part to play as the BI and big data world matures further.  Using the technology to help with the quality of data, and then also deploying it to promote a management-by-exception approach to BI and reporting, is, in my opinion, the way forward.  Let’s do away with the quantity or volume of reports and focus instead on quality, in turn driving improved decision making and really delivering on the promise of BI.
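The management-by-exception idea is easy to sketch in code.  Everything below is hypothetical – the invoice data, field names and grace period are illustrative only – but it shows the principle of filtering a full data set down to just the items a decision maker actually needs to act on (here, overdue receivables rather than every customer balance):

```python
from datetime import date

# Hypothetical customer invoices - in reality this would come from a
# data service or extract, not a hard-coded list.
invoices = [
    {"customer": "A001", "amount": 1200.0, "due": date(2013, 5, 1)},
    {"customer": "B002", "amount": 450.0,  "due": date(2013, 6, 20)},
    {"customer": "C003", "amount": 9800.0, "due": date(2013, 4, 15)},
]

def overdue_exceptions(invoices, as_of, grace_days=0):
    """Return only the invoices past due as of `as_of` - the exception set."""
    return [
        inv for inv in invoices
        if (as_of - inv["due"]).days > grace_days
    ]

report = overdue_exceptions(invoices, as_of=date(2013, 6, 1))
# Only A001 and C003 appear; B002 is on time and never reaches anyone's inbox.
```

The point is that the “report” is the exception set, not the full list – on a good day it is empty, and nobody has to wade through 42 reports to confirm that nothing needs their attention.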

Convergence of SAP tech.

I’m lucky to have an overview and understanding of many of SAP’s technical platforms and offerings, and am ever frustrated by how often it appears the left hand doesn’t know what the right hand is doing.  BPM follows the SOA paradigm, HANA & Gateway more ROA; ABAP uses the transport system, Java uses NWDI and now HANA uses ZGIT…  I could go on, but we’ve all read about it a million times before.  Choice and complexity are good in their place, but sometimes, just sometimes, it would be nice to see some consistency, and maybe some convergence of tech.

In the context of BI, I look to a future where HANA & BPM solutions straddle MDM strategies to ensure the data is good.  Clever reporting platforms bridge the gap between source data and target output, enabling users to generate their reports immediately, in the knowledge that the source data is 100% accurate in all ways.  Taking this further, we lose the 42 reports each day and instead, clever use of BRM-type solutions enables the decision makers to focus on the BI-led decisions that they can and should be worried about, instead of filling their days compiling pivot tables and pie charts for data that is three weeks out of date and completely inaccurate.

Am I on the right track, or just trying to send cod to the moon?


5 Comments


  1. Edwin Vleeshouwers

    The direction seems to be okay too. I just fear that although everybody wants to get on that train, everybody will still want to design the train themselves.

    1. Kerry Dunn

      What about “Exception” reporting to avoid the 42 reports. Exception reports focus on attributes generally agreed to be of value. A very simple example – Accounts Receivable delinquencies report only late payments, not every single customer balance. Reporting objects must always be linked to corporate and operational objectives.

    2. Gareth Ryan Post author

      I think you are correct.  I also think this is where other SAP (and indeed non-SAP) technologies and processes can help enable that self-building, so that the quality is better*

      MDM, BPM, BRM, and a load of other TLAs go a long way to help avoid potential mistakes.

      Of course, as was pointed out on Twitter by https://twitter.com/fitzecarraldo76 this morning, no amount of tools and processes will ever accommodate straightforward human error:-

      *Notice I didn’t say completely correct?! 😉

