Developing a set of evaluation tools for L&D that's aligned to the needs of all stakeholders can be quite a task. John Mattox, Head of Talent and Research at Explorance in the US and co-author of Learning Analytics: Measurement Innovations to Improve Employee Development, offers a few pointers to make the process easier:
1. Prioritise measurement efforts and use appropriate methods.
Not every course is created equal, and stakeholders have different information needs. Prioritise how each course will be evaluated and what information will be gathered. There should be a core set of metrics for all courses and an extended set that meets specific needs across the curriculum.
Some courses only need a quick check on the content and technology platform (e.g. the 10-minute video about how to secure a laptop so it is not stolen from the workplace). Others, like a new curriculum to completely revamp the sales process, may need to measure post-training behaviours after 60 days and their impact on critical sales metrics.
It may be valuable to use a randomised experimental design or a naturally occurring comparison group to help determine causation and differential impact. A reasonable approach is to pursue impact studies (causation studies) for 5-10% of the curriculum, while also using an automated evaluation process to gather standard metrics for the remaining 90-95% of the curriculum.
For courses that are important but not necessarily strategic, supplement the standard processes as needed with additional data-gathering approaches, such as interviews or focus groups with learners and their managers. In this way, richer data sets can be gathered to help make decisions about quality and to continuously improve L&D programmes.
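To make the randomised comparison idea above concrete, here is a minimal sketch of random assignment followed by a difference-in-means comparison. All learner names and scores are invented for illustration; a real impact study would involve far more design and statistical care:

```python
import random
import statistics

def random_assignment(learners, seed=None):
    """Randomly split a list of learners into a treatment and a control group."""
    rng = random.Random(seed)
    shuffled = learners[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def mean_difference(treatment_scores, control_scores):
    """Difference in mean post-training scores between the two groups."""
    return statistics.mean(treatment_scores) - statistics.mean(control_scores)

# Hypothetical roster of ten learners, split at random
treatment, control = random_assignment([f"learner_{i}" for i in range(10)], seed=42)

# Invented post-training sales scores for each group
trained = [78, 85, 80, 90, 82]
untrained = [70, 75, 72, 74, 71]
print(round(mean_difference(trained, untrained), 1))  # → 10.6
```

Because assignment is random, a clear difference in means is easier to attribute to the training itself rather than to pre-existing differences between the groups.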
2. Use the appropriate statistical analysis.
Most learning analytics focus on simple descriptive information such as averages and percentages. Inferential techniques can help determine relationships (correlation, regression, structural equation modelling) as well as differences between groups (e.g. is the experienced sales team really performing worse than the new sales team members?).
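As one concrete flavour of the group-difference question, the sketch below computes a Welch's t statistic by hand for two hypothetical sales teams. The figures are invented; in practice a statistics package or a statistician would handle the test and its interpretation:

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    standard_error = math.sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_a - mean_b) / standard_error

# Invented quarterly sales figures for two teams
experienced = [102, 98, 95, 101, 99, 97]
new_hires = [104, 108, 103, 107, 105, 106]

# A t statistic far from zero suggests a real difference between the groups
print(round(welch_t(experienced, new_hires), 2))
```

A strongly negative value here would support the hunch that the experienced team is currently scoring lower than the new hires, which is exactly the kind of question descriptive averages alone cannot settle with confidence.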
Predictive analysis can also be used to determine which training factors lead to optimal learning, application and performance. Rather than applying one technique across all situations, consult experts who can suggest the best technique for each situation.
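As one simple illustration of predictive analysis, the sketch below fits a least-squares line relating hours of training to a post-training performance score and uses it to predict an unseen case. The data is invented, and real predictive work would use richer models chosen with expert guidance:

```python
import statistics

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope * x + intercept."""
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(x, slope, intercept):
    """Predicted score for a given number of training hours."""
    return slope * x + intercept

# Invented data: hours of training vs. post-training performance score
hours = [2, 4, 6, 8, 10]
scores = [55, 60, 65, 70, 75]
slope, intercept = fit_line(hours, scores)
print(predict(12, slope, intercept))  # → 80.0
```

Even a toy model like this shows the shift in question: from "what was the average score?" to "what score should we expect for a given amount of training?".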
3. Use the information you gather.
Many measurement teams spend so much time and effort gathering, analysing and building reports that they are too exhausted to effectively communicate results to stakeholders. Results that are reported but not acted on are the ultimate sign of wasted measurement effort. Automate the standard, repeatable, error-prone processes (e.g. data collection, storage, coding, analysis and reporting) and then spend time and effort helping stakeholders review, understand, and act upon their results.
4. Create a short-term and multi-year measurement strategy.
Strategy begins with support from leadership. L&D and business leaders need to commit financial, technology, personnel and process resources in order for any strategy to get started. The strategy should address technology tools and expert personnel who can develop standard processes and communicate impact. Review the strategy yearly and adjust as needed.
5. Lastly and most importantly, align to the needs of stakeholders.
Measurement experts are masters of the ‘why’ and ‘how’ of measurement. They need guidance and input from their stakeholders to focus their efforts on the ‘what’. What should be measured? What are the business drivers? What information is needed to make decisions? Measurement without alignment is just OK measurement.
Interestingly, the results of our National Learning & Development Survey 2019 revealed that only 15% of L&D professionals share the necessary L&D analytics with the relevant stakeholders; only 20% indicate that their metrics are linked to business metrics; and only 17% indicate that their L&D metrics are connected to business strategy.
John Mattox will be in South Africa for three days this November. He will present a two-day workshop on Learning Analytics in Johannesburg on 11 and 12 November 2019. The workshop will address the fundamental reasons for measuring the quality of training, and provide insights into how to create a scalable, sustainable measurement strategy that gives stakeholders timely and valuable information about the quality of training programmes. You can learn more about the workshop programme here.
Mattox, J. R. (2019, July 30). Beware OK Measurement Practices. LinkedIn.