ATD Blog

Proving That Your Training Program Is Successful

Wednesday, October 28, 2020

Has your L&D team ever been asked to show its value? Have you ever proposed a development solution and heard the C-suite say something like, “Why should we spend the money and time on that?”

In the October 2020 issue of TD at Work, “Evaluate Learning With Predictive Learning Analytics,” Ken Phillips provides a methodology you can use to show the difference that a training program makes.

Why Training Sometimes Fails

We know that not all of our training translates into changed behavior on the job. The cost of that scrap learning, counting your design and delivery time, the program’s expense, and the hours employees spend in training they never apply, can be considerable (a quick cost sketch follows the list below). There are many reasons behind those lost dollars and hours, for example:

  • Program design. How relevant is the information you conveyed during the program to learners’ jobs? Will the new skill or knowledge contribute to the employee’s career? Does the behavior contribute to a trackable business metric?
  • Learner attributes. Do employees think they’ve been given an opportunity to learn challenging new things? Do they believe they can perform the new skill on the job? Are they motivated to apply the knowledge?
  • Work environment. Is the participant’s manager supportive of the learning program? Will the manager provide an opportunity for their direct report to apply the new skill in their role? Will colleagues support the use of the new information?

These questions point to the reasons training may fail to transfer, that is, to translate into changed behavior on the job.
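
To make those lost dollars concrete, here is a minimal back-of-the-envelope cost sketch in Python. Every figure in it (cohort size, hours, loaded hourly rate, program cost, and the scrap rate itself) is a hypothetical placeholder for illustration, not a number from Phillips’s issue.

    # Hypothetical estimate of scrap learning cost; all figures are
    # illustrative assumptions, not data from the article.
    participants = 100           # cohort size
    hours_in_training = 8        # program length per participant
    loaded_hourly_rate = 60      # fully loaded cost of an employee hour (USD)
    program_cost = 25_000        # design, development, and delivery (USD)
    scrap_rate = 0.45            # share of learning never applied on the job

    employee_time_cost = participants * hours_in_training * loaded_hourly_rate
    total_investment = program_cost + employee_time_cost   # 73,000
    scrap_cost = total_investment * scrap_rate             # 32,850

    print(f"Total investment: ${total_investment:,.0f}")
    print(f"Estimated scrap:  ${scrap_cost:,.0f}")

Even under these modest assumptions, nearly half of a $73,000 investment produces no on-the-job change, which is exactly the kind of number that gets the C-suite’s attention.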


Discovering the Root Cause

With so many reasons training may not be put into practice, how does an L&D professional know which factor to address?

Start by selecting the appropriate training program to analyze: one that is high profile enough to get the C-suite’s attention, has enough participants, and was designed and developed internally. Then solicit input from L&D teammates, managers, and leaders close to the program participants to collect data that will allow you to estimate the amount of scrap learning.

Once you have estimated scrap learning within your targeted program, select a cohort of participants to survey. Their responses will tell you whether learners believe they have the knowledge and skills to perform well back on the job, feel supported by their managers, and consider the program relevant to their jobs and careers.


From the data collected from your training cohort, you can calculate scrap learning. One question you ask of learners will be key, says Phillips: “If you are not applying 100 percent of the program material back on the job, what obstacles prevented you from applying what you learned?”
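
As a rough illustration of that calculation, the Python sketch below turns hypothetical survey ratings into a learner application index and a scrap learning estimate. The 1-to-5 rating scale, the sample responses, and the simple averaging are all assumptions made for illustration; Phillips’s TD at Work issue defines its own learner application index.

    # Sketch: estimate scrap learning from cohort survey data.
    # Hypothetical responses: each learner rates, on a 1-5 scale, how much
    # of the program material they are applying back on the job.
    application_ratings = [5, 4, 2, 5, 3, 1, 4, 2, 5, 3]

    # Map ratings onto an application percentage (1 -> 0%, 5 -> 100%).
    applied = [(rating - 1) / 4 for rating in application_ratings]

    application_index = sum(applied) / len(applied)  # average share applied
    scrap_learning = 1 - application_index           # share not applied

    print(f"Learner application index: {application_index:.0%}")  # 60%
    print(f"Estimated scrap learning:  {scrap_learning:.0%}")     # 40%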

Looking at survey scores from multiple learners will give you a sense of trends: do many learners think the program lacked relevance to their jobs, for example? Use your best L&D skills to design a solution or solutions. After an appropriate interval, gauge whether your solution has made a difference. Make sure you correlate evaluation results with learner application index scores so that you maintain credibility with the C-suite.
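
One lightweight way to run that correlation check is sketched below. The paired values are hypothetical; in practice you would pair each learner’s evaluation result with that same learner’s application index score.

    # Sketch: do evaluation scores track application index scores?
    from statistics import correlation  # Pearson's r, Python 3.10+

    evaluation_scores = [78, 85, 62, 90, 70, 88]              # e.g., post-course assessments
    application_index = [0.55, 0.70, 0.40, 0.85, 0.50, 0.75]  # share applied on the job

    r = correlation(evaluation_scores, application_index)
    print(f"Pearson r = {r:.2f}")

A strong positive correlation gives you evidence, not just assertion, when you present results to senior leaders.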

When you are confident that scrap learning has been reduced, meaning your solution is saving time and money and learning is transferring to the job, share your findings with the C-suite in the format and level of detail your senior leaders prefer.

You will have successfully proven the TD function’s value!

About the Author

Patty Gaul is a senior writer/editor for the Association for Talent Development (ATD).