Casual users are, I suspect, the vast majority of active participants in an organisation’s analytics. They sit in meetings as slide deck after slide deck of graphs, tables, reports and Key Performance Indicators slips by, with no idea of the assumptions or processing that may have been applied to the data. Graphs are cropped to hide ugly outliers, their data carefully framed for maximal impact. The data is only really there to support the (foregone?) conclusions of the presenter. No broader or deeper analysis is done: is there a trend in the data? Does a different story emerge when the data is viewed in a historical context, or compared against other data sets?
Often, even a cursory glance at the data suggests there might be some interesting patterns, and raises many questions which, reasonably, there’s little time to discuss in a meeting. And so, sadly, the path to more insight or deeper understanding ends there. Worse, the analytical content is often just cut and pasted onto a PowerPoint slide, leaving little chance of examining the analyses or exploring the underlying data. This is the casual user’s first obstacle.
Instead, it would be nice if, after the presentation, you could get hold of the data and understand it yourself, because there’s a link to it. And it would be nicer still if, when you clicked that link, you got a browser window full of familiar controls, buttons and data sources, all of which you trust and are happy to use. You can now explore the data, if that’s what you want to do.
Exploring the data is all well and good, but, like a recipe, the process follows a typical path: gathering the data, preparing or cleaning it, combining and processing it, and finally rendering the report, the visualisation, the analysis.
So it would help productivity, reduce time-to-insight and help avoid errors to have a tool that understands that sequence of activities, that workflow. The employee can then go back and revisit the stages of the recipe: see which data sources informed it, and understand how the data may have been processed. The recipe metaphor continues to be quite helpful – you might like to take a recipe and ‘adjust’ it a little. Adjust it to change the flavours a bit, or to add a new ingredient (OK, yes, chocolate).
Extending the metaphor a little, these employees aren’t the head chefs, the professional cooks of the organisation. They are more like home cooking enthusiasts, with their full range of proficiencies and proclivities. What they have in common is that they can follow and adapt recipes, and they have an idea of the roles various ingredients play. These users are comfortable doing this because they’re drawing on trusted data sources for their ingredients, be they structured or unstructured data, and so they can decide whether adding new data, or new data sources, will improve the end dish. They can also choose to apply different analyses from those previously used, adapting the recipe to arrive at a new dish. These two everyday skills provide the means, when applied to analysis, to generate new and valuable insights, all as a result of something that piqued their interest in a meeting.
But – let’s be honest – they then like to share their new recipes with others. It’s in that sharing that the full value of their ideas, their data and their analyses can be realised. Sharing leads to the avid, enthusiastic discussions that are the basis of collaboration. That’s one way in which Agile Analytics empowers the 75% of an organisation’s people who currently don’t, can’t or haven’t taken advantage of the latent data and insight already available within the business. Come to SAP UKI’s Innovation Forum event, 4th June, in London, to find out more.
I’d like to thank Pete Humble, an SAP BI guru, for his insightful contributions to this blog.