Seize the Data – Learning Metrics & Analytics
I’ve been a sporadic contributor to the SAP Community Network; recent posts include Managing a Global Talent Supply Chain and What Does the Future Hold for HR Professionals? To tie together some of these themes, I want to spend some time writing over the summer on how workforce analytics bolsters efforts to measure the impact of activities across several specific talent management domains – Learning, Recruiting, Performance, and Succession.
Rather than simply providing my perspective, however, each post will also include an interview with a SuccessFactors subject matter expert close to the topic, as a way to tap into the conventional wisdom and future aspirations of colleagues who are leading the charge in communicating the value of data within their domains.
Below, I’ve started by looking at Learning Analytics, aimed at professionals in the learning/training field; subsequent posts will cover the three additional terrains mentioned above.
Finally, the title of the blog (“Seize the Data”) is not without meaning for workforce analytics – many organizations see analytics as “the future” but treat it as a lower technology priority than investments in core or transactional talent management systems. I would like to see HR leaders “seize the day” – holding a conversation today about how workforce analytics can deliver immediate business impact by leveraging data from across the talent enterprise.
1. What’s conventional practice for using data to measure the impact of learning?
According to Peter Howes and Ed Cohen who collaborated on a SuccessFactors white paper entitled Learning and Analytics (registration required), “the relationship between learning and analytics is often misunderstood and under-utilized. Typically, when people discuss analytics they are talking about reporting within a specific learning management system. They are talking about course completions, scores, pass rates and usage data.”
These points were echoed in speaking with two colleagues who are experts in the field of learning and who support SuccessFactors Learning – Andy Shean, Senior Solutions Consultant, and Nate Hurto, Vice President of Solution Consulting. Their view is that “reporting and analytics have always been important to Learning; however, [they] are under-represented. This is especially true when looking at the impact of training on organizational KPIs.” For example, Andy and Nate agree that compliance reporting – listing which employees are in, or out of, compliance with a specific training item – is very powerful data for a shift supervisor who needs to roster staff according to who is qualified to operate a particular machine or follow a specific process.
Similarly, data on training utilization are also very worthwhile. For example, a SuccessFactors Workforce Analytics customer regularly publishes an automated national training scorecard – data include % of employees trained, training hours completed, types of courses (blended, classroom, e-learning), and reasons for course cancellation (the most common of which is a lack of participants).
2. What’s missing from this approach?
However, as noted by Andy and Nate, analytics showing the business impact of learning are limited in their adoption. They argue that there is a “clear opportunity to build on transactional reporting with analytics that utilize trend data from across the enterprise, including both input (learning) and output (business results) data. This demonstrates the value of training to the enterprise, allows for re-allocating scarce resources from low-impact to high-impact curricula, and, in cases where leaders seek to cut funding for staff development, ‘objectively justify the importance of the learning/training function’”. Peter and Ed put it a slightly different way, saying that LMS reporting “doesn’t really help you to make the connection with what is happening in your organization, or whether the training is having the desired effect of boosting productivity, improving service and raising levels of efficiency.”
Efforts to instill an analytics culture within learning have been mixed. For every organization publicly cited for excellence in learning analytics (Bersin’s 2012 Learning Leaders included AT&T, Cisco, and Grant Thornton), there are others that are just getting started, often with mixed, or troubling, results. For example, a senior HR manager at a public institution recently shared a story of how his HR leader brought to the executive table analysis of how the institution’s high-potential staff were more likely to receive promotions and pay increases after going through leadership training. This analysis was conducted without a control group (how did the trained high-potentials compare to high-potentials who didn’t attend the training?) or any measures of impact on the institution’s mission, financials, or people outcomes.
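To make the control-group point concrete, here is a minimal sketch of the comparison the analysis above was missing. The data, field names, and promotion rates are entirely hypothetical – the point is only the structure: compare trained high-potentials against untrained high-potentials, not against nothing.

```python
# Hypothetical records: each high-potential is tagged with whether they
# attended the leadership training and whether they were later promoted.
high_potentials = [
    {"trained": True,  "promoted": True},
    {"trained": True,  "promoted": True},
    {"trained": True,  "promoted": False},
    {"trained": False, "promoted": True},
    {"trained": False, "promoted": False},
    {"trained": False, "promoted": False},
]

def promotion_rate(employees):
    """Share of employees promoted within the observation window."""
    return sum(e["promoted"] for e in employees) / len(employees)

# The treated group alone tells you little; the control group supplies
# the baseline that makes the comparison meaningful.
treated = [e for e in high_potentials if e["trained"]]
control = [e for e in high_potentials if not e["trained"]]

lift = promotion_rate(treated) - promotion_rate(control)
print(f"Trained: {promotion_rate(treated):.0%}, "
      f"Control: {promotion_rate(control):.0%}, Lift: {lift:+.0%}")
```

Even this toy comparison only establishes correlation, of course – but it is a far better starting point for an executive conversation than a single uncontrolled number.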
3. What might be examples of foundational metrics to apply to learning?
Of course, there are different approaches to moving beyond LMS reporting. One is simply to slice your training/learning data – by format, location, duration, participant, course status, certification awarded, or skill developed – to isolate specific problem areas or insights that lend themselves to interventions. It is difficult to change results for the entire organization; better to start small and focus on problems that require small fixes but offer the possibility of high returns. Another approach is to create output-based scorecards (as opposed to those focused on inputs or throughputs to the learning process); examples of measures used by SuccessFactors Workforce Analytics customers include:
• Courses Supporting The Completion of Development Objectives
• Training Penetration Rate
• Employee Satisfaction With Training
• Trained/Untrained Employees Average Performance Rating
• Training Investment per Employee
4. What about more advanced analytics?
The list above is still fairly Training Function-oriented, so a different approach is to stand back and consider what outcome/process change/decision we are seeking to better understand through the inclusion of learning data. For example:
• What type/frequency/format of training can provide the biggest boost to sales? Several years ago, we looked at sales rep training at one of our customers and found a $250,000 difference (in terms of annual sales) between reps who had taken all of the prescribed training modules for their role and those who had taken none of the training.
• Do development efforts pay off in terms of building organizational flexibility & capability? Increasing learning agility, as discussed in this HR Executive article, would improve an employee’s ability to adapt to new situations and cultures (though success here might be difficult to measure).
• What business impact does social learning have? Many employees have access to corporate social networking tools, such as SAP Jam; firms are investing with the understanding that such applications drive productivity and reduce time-to-proficiency for new hires.
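The sales-training question in the first bullet reduces to an outcome comparison like the one below. The rep-level figures here are invented for illustration (they are not the customer data cited above, though they are scaled to produce a gap of the same order).

```python
# Hypothetical rep-level data: module completion (input) joined with
# annual sales (output).
reps = [
    {"modules_completed": 5, "modules_required": 5, "annual_sales": 900_000},
    {"modules_completed": 5, "modules_required": 5, "annual_sales": 850_000},
    {"modules_completed": 0, "modules_required": 5, "annual_sales": 650_000},
    {"modules_completed": 0, "modules_required": 5, "annual_sales": 600_000},
]

def avg_sales(group):
    return sum(r["annual_sales"] for r in group) / len(group)

# Compare reps who took all prescribed modules with those who took none.
fully_trained = [r for r in reps if r["modules_completed"] == r["modules_required"]]
untrained = [r for r in reps if r["modules_completed"] == 0]

gap = avg_sales(fully_trained) - avg_sales(untrained)
print(f"Average annual sales gap: ${gap:,.0f}")
```

In practice you would also want to control for territory, tenure, and self-selection (high performers may be more likely to complete training), but even this simple cut frames the conversation in business terms rather than course completions.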
5. Any final words of advice?
Andy and Nate suggest three principles to guide your approach to learning metrics:
1. Integrate your learning data with other talent management data sources, especially performance, goals, and succession
2. Make the data easily available – if one of your goals is to create a culture of data-driven talent management decisions, confining learning metrics to those in the training department makes no sense. Share insights and encourage questions.
3. Get real results – Don’t stop with showing how leadership training helped your high-potentials get a promotion; be creative about working through a set of possible business outcomes – higher productivity, faster completion of goals, higher levels of product innovation, reduced costs, etc.
Stay tuned for future blogs that apply the same approach to other Talent Management domains.