A strategic partner to the business is able to prove the effectiveness and efficiency of their organization. Today, we measure time-in-training, completion rates, and test scores. Unfortunately, these don't tell you whether the training you are providing is actually working. The business wants more: granular, actionable metrics. This can be accomplished when we take a Google Analytics approach to measuring learning.
What if we had a set of learning analytics that could be plugged into the corporate scorecard? And, what if we could take immediate action on these analytics to make course corrections at the speed of business?
In her session at the ASTD 2014 Conference, Xyleme VP of Marketing Dawn Poulos addressed these questions through a series of real-world use cases that showed learning data aligned to functional performance indicators. Attendees left the session with the analytical tools they needed to prove the ROI of their training initiatives.
The Google Analytics Approach to Measuring Learning: Allowing HR to move at the speed of business
1. The Google Analytics Approach to Measuring Learning
Allowing HR to move at the speed of business
Dawn Poulos
Vice President of Marketing
Xyleme, Inc.
Now let's talk about how we currently measure learning. I'm going to go ahead and launch another poll. If you choose 'other', use the chat box to elaborate.

Let's go through each of these and the questions we need to ask ourselves: do these metrics speak the language of business?

Let's look at the LMS data. As you saw on the previous slide, LMS metrics are proprietary to the LMS you are using and can only measure formal learning. The problem is that corporate does not want to hear any more about how many hours or days you've spent in training, because that doesn't tell them whether the training is efficient or effective. Completion rates don't help either, because there are a lot of smart people in companies who never complete courses.

Let's look at MOOCs. MOOCs have very, very low completion rates, yet lots of people learn from them. Why? Because they go in, find what they need, say goodbye, and go apply it. Completion has nothing to do with whether a course was good or not, so poor data leads to poor decisions.

What about Level 1 evaluations (smile sheets) or Level 2 assessments? Can an instructional designer take that information and act on it? For example, she gets feedback that the course was good for the first 10 minutes, then irrelevant. The ID doesn't know what to do with this information, so right now IDs are afraid to ask, because rapid authoring tools (RATs) do not allow them to measure any aspect of the content they create. That's a bummer.

If necessary, touch on Levels 3 and 4: difficult and not scalable.
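The "Google Analytics approach" the talk argues for can be illustrated with a minimal sketch: treat course sections like web pages and compute per-section views, average time, and drop-off from an event log. The data, field names, and thresholds below are hypothetical assumptions for illustration only, not anything from the session.

```python
# Illustrative sketch: GA-style engagement metrics for course content.
# All events and names here are hypothetical sample data.
from collections import defaultdict

# Hypothetical event log: (learner_id, section, seconds_spent)
events = [
    ("a", "intro", 120), ("a", "module-1", 300), ("a", "module-2", 40),
    ("b", "intro", 90),  ("b", "module-1", 20),
    ("c", "intro", 150), ("c", "module-1", 280), ("c", "module-2", 260),
]

def section_metrics(events):
    """Per-section view count and average time spent, like GA page metrics."""
    views = defaultdict(int)
    seconds = defaultdict(int)
    for _learner, section, secs in events:
        views[section] += 1
        seconds[section] += secs
    return {s: {"views": views[s], "avg_secs": seconds[s] / views[s]}
            for s in views}

metrics = section_metrics(events)

# Drop-off: share of learners who reached the first section but not this one.
entry_views = metrics["intro"]["views"]
dropoff = {s: 1 - m["views"] / entry_views for s, m in metrics.items()}

print(metrics)   # e.g. module-1 was viewed by all 3 learners
print(dropoff)   # e.g. 1 of 3 learners never reached module-2
```

Metrics like these point an instructional designer at a specific section to fix (low average time, high drop-off), which is the kind of actionable, content-level signal completion rates and smile sheets cannot provide.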