At the Gartner BI and Analytics Summit in Barcelona this week, I found myself participating in an interesting debate: in this new age of pervasive business intelligence and analytics, should the growing army of business users receive education and training in analytics, or should the tools simply be made easier to use?

Gartner posed the question directly during a panel discussion: should we improve user skills rather than simplify the tools? The premise was that if analytics tools were easier to use (with features like guided discovery and intuitive interfaces), users would need to know absolutely nothing before applying statistical packages, visualization templates, and predictive algorithms to uncover the gems of insight buried in the data. This feels a little like suggesting that even though we live in a world with a burgeoning surfeit of data, as long as users have the right analytics tools, the data will simply tell its own story.

Me? I’m not so sure. I’ve found that if I wade into a mass of information without any prior hypotheses about the types of relationships and trends I might find, it’s difficult to separate the useful data from the misleading or confusing data that’s simply noise. I need to go through a process similar to classical market research: first, develop testable hypotheses through qualitative focus groups and face-to-face interviews to surface the issues, then collect and analyze quantitative data to measure which hypotheses are true and important enough to act on. The parallel in business is having a dialogue with colleagues and customers before diving into the data.

Is the Traditional Scientific Method Obsolete?

Chris Anderson challenged this accepted approach in an article he wrote for Wired back in 2008, ‘The End of Theory: The Data Deluge Makes the Scientific Method Obsolete’. He suggested that in the petabyte world of big data, the traditional approach of hypothesize, model, test is obsolete, and that “the new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world.” Many people have criticized such optimism. A common counterexample is planetary weather: there is so much noise in weather data that trends in global warming would have been unlikely to surface without researchers having a prior hypothesis.

It seems to me that Anderson’s bold statement also denies us, the humans, any role in analytics. While I don’t doubt that some data-driven predictions can succeed, I’m convinced that a questioning mind, a better knowledge of the math, and an appreciation of the misconceptions people typically hold will always result in better assessments and reduce some of the inherent risk of poor business decisions.

Analytical Education Still Relevant and Increasingly Vital

So the point I was trying to make, albeit in 140 characters on Twitter, is that although we do need to make tools easier to use, we (vendors, organizations and, increasingly, our educational systems) also have a responsibility to ensure that anyone using analytic tools receives a general grounding in simple statistics and research methods. These users can then develop a questioning approach to using data, and by extension analytics, to support the business decisions they make in their working lives. This approach could entail:

  • Questioning the validity of the data they’re working with. For example, we know that cancer patients can be misdiagnosed: some who are told they have the disease don’t (a false positive), while others who are told they don’t have the disease in fact do (a false negative). You can bet we encounter the same issue in business all the time and never even think about it! (There’s a short sketch of why this matters just after this list.)
  • Teaching probability and significance so we appreciate that findings and predictions are generally range-based rather than single data points, and how this plays out in results such as elections or sales wins, which have binary (win all or lose all) outcomes.
  • Helping us become ‘diligent skeptics’ when it comes to data. A colleague tells a great story from insurance: a business analyst in the special lines division merged some previously siloed data sets, found considerable crossover between customers taking out one or more of their policies (cover for pets, musical instruments, extended warranties), and was adamant this was evidence of “significant customer loyalty”. In reality, the insurer was the market leader in special lines, all of which were sold under different brand names through different channels. The crossover was inevitable rather than evidence that a concerted cross-selling campaign would succeed.
  • Showing students how best to present and visualize data so it’s easy to understand and can be quickly digested.
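To make that first bullet concrete, here’s a minimal sketch in Python of the base-rate effect behind false positives. The numbers are invented for illustration, not taken from any real diagnostic study: even a test that catches 90% of true cases can produce mostly false alarms when the condition it screens for is rare.

```python
# Illustrative numbers only: a rare condition and an imperfect test.
population = 100_000
prevalence = 0.01            # 1% of people actually have the condition
sensitivity = 0.90           # the test catches 90% of true cases
false_positive_rate = 0.05   # ...but also flags 5% of healthy people

sick = population * prevalence
healthy = population - sick

true_positives = sick * sensitivity
false_positives = healthy * false_positive_rate

# Of everyone who receives a positive result, how many are actually sick?
precision = true_positives / (true_positives + false_positives)
print(f"Positive results: {true_positives + false_positives:,.0f}")
print(f"Actually sick:    {true_positives:,.0f} ({precision:.0%})")
# -> about 15%: most positives are false, despite a '90% accurate' test.
```

The same arithmetic applies whenever a business flags ‘churn risks’ or ‘fraud cases’ that are rare in the underlying population: the raw number of flags says very little until you ask how many of them are real.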

If we don’t focus on analytical education and training, we risk entering an age of pervasive analytics that rapidly becomes dysfunctional as people act on insights that are beautifully presented but entirely misplaced. Analytics, forecasts, and predictions will always carry some degree of error. Everything we can do to minimize that error will improve the decisions we make and benefit our businesses.

Even the black belts get it wrong occasionally. Nate Silver, the statistician who became famous for using sophisticated statistical analysis to predict state-by-state voting outcomes in the 2008 and 2012 US Presidential elections, fouled up on predicting the result of the Super Bowl last weekend. If the experts can get it wrong, where does that leave the rest of us?
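Silver’s miss is actually a neat illustration of the range-versus-binary point from the list above. A quick simulation (in Python, with a win probability I’ve picked purely for illustration, not Silver’s actual forecast) shows why a sound probabilistic prediction can still be ‘wrong’ on any single all-or-nothing outcome:

```python
import random

random.seed(42)                # reproducible illustration
win_probability = 0.80         # an invented forecast, not a real one
trials = 10_000                # replay the same one-off event many times

losses = sum(random.random() > win_probability for _ in range(trials))
print(f"The favourite still loses {losses / trials:.0%} of the time")
# -> roughly 20%: the forecast wasn't flawed; the outcome was simply binary.
```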
