
The Operational Value of L&D Data

Monday, June 5, 2023

It has been 20 years since Karl Kapp first asked how long it takes to develop one hour of training. From 2003 until 2017, we surveyed the industry, asking this question along with various others to gain insight into the field's most sought-after answer.

In 2020, to address the changing ways in which the field produces training, we updated the question to simply ask how long it takes to develop training. Now, during data collection for our next survey, the basis for this line of inquiry remains relevant and interest in it remains steady.

Another constant is the set of challenges L&D professionals face. If we went back to the research from 2009 and examined the “why” behind the number of hours required to develop a training session, we would find the same story that plays out today: both internal operational issues and external problems facing the client create L&D bottlenecks.

Clearly there is value in asking how long training development takes and in the data that’s shared in response, but there’s even more value when the L&D function collects its own data around this topic, based on its own needs. Previously, we’ve highlighted how one organization studied and created the first dataset for training-development time that represents a specific industry. We can unlock the full potential of the learning organization, from operational efficiencies to new revenue streams for clients, by using this data as a starting point.

It often feels like an insurmountable task to gather quality data, but it can be done!

What Will the Data Do for Me?

Twenty years after Kapp posed his question, here is what most of our field still faces:

  • L&D leadership does not collect or use operational data to inform their function’s capacity or capabilities.
  • L&D functions are often reactive versus proactive.
  • L&D operations are not understood or deployed equally even by L&D teams.

It’s hard to command a seat at an organization’s decision-making table when the business aspects of L&D are not being addressed by the L&D department itself. This is an issue with systems more than with people.

Our field is 79 years young, and from a business perspective, it’s still a maturing industry that appears slow in adopting innovation. I’m not talking about AI and ChatGPT or the products we produce with technological innovations; I’m talking about the business function of an L&D service.

Business innovation for L&D could be as simple as an organization’s L&D department creating internal templates and style standards. A moderately advanced example would be a vendor in the L&D space cross-training their employees to reduce the risk of project downtime.

Both of these examples demonstrate efficiency in production, consistent quality in learning products, and risk mitigation tactics. These value propositions have immediate benefit for L&D professionals and are also part of the organization’s broader value proposition.

Efficient production and high-quality products create cost savings, not to mention flexibility in the L&D professional’s capacity. Better yet, your L&D professionals will probably not burn out trying to keep up!

However, such data points and operational efforts are often overlooked by the L&D function, which makes the case for a seat at the table a hard sell. What sway does a static story of being overworked, misunderstood, and undervalued hold compared to a compelling value proposition that aligns with the organization’s strategy?


So, how do we get the kind of powerful data that makes a case for L&D’s organizational value-add? We don’t want just any data—we want standardized, reliable, and systematic data. Collecting it not only takes time but also requires leadership, buy-in, and an operational framework, complete with policies and processes.

Gain Support From Your Leadership

Ask your leadership to take on this challenge for your department! Explain that the results will provide:

  • Insight into bottlenecks
  • Comprehension of the department’s perspective
  • Identification of gaps in processes, policies, and procedures

Bottlenecks and scope creep—such as hurry-up-and-wait, repeating the review cycle too many times, or a nonresponsive team member—significantly inflate training development budgets and project timelines.

We also cannot assume that team members, clients, subject matter experts, and others have a consistent understanding of L&D work and products if the L&D department doesn’t have a shared language for describing them. Even if your team comprises only credentialed instructional designers, you cannot guarantee that they all value and approach analysis in the same way.

Because L&D teams are often in reaction mode, it may seem like there’s no time to create documentation. But if we believe that, then we believe we have no ability to standardize processes or determine policies.

What leader would want that?

Create a System to Collect Data

This is where pushback from clients comes in quickly. Whether it’s a belief that tracking data about development hours signals to the team that they’re being scrutinized, or an argument that their work is too volatile to standardize—I’ve heard it all.

In the survey we use to collect industry data, we set a standard and define it. We put this up front to help folks rethink how they report their hours in response to our inquiries.


When you’re internal to your organization, you get to create the standard. What work categories do all team members share? For example, your team’s work may include:

  • Non-project efforts on a daily basis
  • Professional development time afforded to everyone
  • Administrative elements for the projects they’re working on (meetings, emails, and the like)
  • Maintenance on learning products, platforms, and systems

These are just a few possibilities. Consider how a developer spends their time versus a manager—what categories would they need? Perhaps client management, team management, short-term planning, and so on?

Choose your categories, define each, and provide examples. Be comfortable coaching your team to ensure consistency. They may not record everything perfectly at first, but open discussions can help to refine how the categories work for your team!
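
To make “choose your categories, define each, and provide examples” concrete, here is a minimal sketch in Python. The category names, definitions, and the TimeEntry structure are hypothetical stand-ins for whatever your team agrees on, not a prescribed taxonomy; the point is that the shared definitions live in one place and every logged hour has to map to one of them.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shared work categories, each with a plain-language definition.
# Your team's names and definitions will differ; what matters is that they
# are written down once and used by everyone.
CATEGORIES = {
    "non_project": "Daily efforts not tied to a specific project (stand-ups, ad hoc requests).",
    "professional_development": "Learning time afforded to everyone (webinars, courses, conferences).",
    "project_admin": "Administrative work on assigned projects (meetings, emails, status updates).",
    "maintenance": "Upkeep of learning products, platforms, and systems.",
}

@dataclass
class TimeEntry:
    """One logged block of work time, tagged with a shared category."""
    person: str
    day: date
    category: str
    hours: float
    note: str = ""

    def __post_init__(self):
        # Reject entries that fall outside the agreed-upon categories so the
        # dataset stays standardized from day one.
        if self.category not in CATEGORIES:
            raise ValueError(
                f"Unknown category '{self.category}'. Use one of: {', '.join(CATEGORIES)}"
            )

# Example: an entry a developer might log at the end of the day.
entry = TimeEntry("Dana", date(2023, 6, 5), "project_admin", 1.5, "Review-cycle emails")
```

A spreadsheet with a locked-down category column accomplishes the same thing; the format matters far less than the shared definitions behind it.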

Share the Journey

In even six weeks’ worth of data, stark patterns can begin to emerge. Bring those insights forward to the larger team. This helps organizational leadership understand that data collection isn’t personal; it’s the business of optimizing and empowering a department.

Open discussion can lead to the qualitative insights you need to effectively tell the story of the data. I recall the end of a project when I shared with my team how each individual had contributed to reducing the overall project timeline. We’d used a new process we had just developed based on previous work, and even while we were refining it, the client had put that process to the test with an expedited timeline.

The real win was sharing how we had not only met but surpassed the client’s rushed deadline, while also providing an error-free solution for them. My team realized that we’d maximized the process we’d created and adapted to the challenges that inevitably cropped up, and that their individual skills had culminated in success.

I could have merely thanked the team for a job well done, but this way, they understood the value of their contributions toward operational success as well as project success.

Concluding Thoughts

It takes time and experience to gather quality, valuable data. However, that shouldn’t deter beginners from collecting data or using it to inform decisions. Refining the data collection process and the data itself for standardization and reliability is part of the process.

If you need a starting point, have your team collect data for three weeks against a set of standard categories you create. Meet after three weeks and discuss the data and the categories with the team. Determine where there is disparity and where there is commonality. Resolve the disparity, reshape the collection approach as needed, collect for three more weeks, and repeat.

In nine weeks, you should have a fairly firm standard that yields consistent input. Of course, keep watching for deviations and outliers.
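
If the entries land in a spreadsheet export or a simple script, even a small roll-up can surface the disparities worth discussing at the three-week check-in. Below is a minimal sketch, assuming entries shaped like the earlier example and an arbitrary 50 percent tolerance; both are illustrative choices, not a recommended standard.

```python
from collections import defaultdict, namedtuple
from statistics import mean

# Minimal stand-in for a rolled-up time entry; any object with person,
# category, and hours attributes would work just as well.
Entry = namedtuple("Entry", ["person", "category", "hours"])

def totals_by_person_and_category(entries):
    """Sum hours per (person, category) so the team can compare notes."""
    totals = defaultdict(float)
    for e in entries:
        totals[(e.person, e.category)] += e.hours
    return dict(totals)

def flag_disparities(totals, tolerance=0.5):
    """Flag totals that differ from the category's team average by more than
    `tolerance` (50% by default). A flag is a conversation starter, not a
    verdict: it may simply mean two people define the category differently."""
    by_category = defaultdict(list)
    for (person, category), hours in totals.items():
        by_category[category].append((person, hours))

    flags = []
    for category, rows in by_category.items():
        if len(rows) < 2:
            continue  # Nothing to compare if only one person logged it.
        avg = mean(h for _, h in rows)
        for person, hours in rows:
            if avg and abs(hours - avg) / avg > tolerance:
                flags.append((person, category, hours, round(avg, 1)))
    return flags

# Three weeks of made-up totals rolled up from daily entries.
entries = [
    Entry("Dana", "project_admin", 18.0),
    Entry("Lee", "project_admin", 6.5),
    Entry("Sam", "project_admin", 7.0),
    Entry("Dana", "maintenance", 4.0),
    Entry("Lee", "maintenance", 5.0),
]
print(flag_disparities(totals_by_person_and_category(entries)))
# [('Dana', 'project_admin', 18.0, 10.5)] -- worth a conversation, not a judgment.
```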

Need an example? Great—contribute data to our current survey on how long it takes your organization to develop training!

About the Author

Robyn A. Defelice, PhD, is a consultant, author, and presenter with over 20 years in the L&D field. She is the co-author of Microlearning: Short and Sweet and has been researching How Long Does it Take to Develop Training for over a decade.

Robyn’s passion comes from working with L&D leadership and teams, supporting them in taking their seat at the decision-making table. She strives to empower L&D professionals by drawing on her adventures in meeting client challenges, from corporate to manufacturing, government to higher education, and nonprofits to startups.

She enjoys presenting to ATD chapters and sharing on her three favorite topics: problem-solving capabilities, microlearning, and learning operations. She is also currently serving as the VP of Programs for ATD Central Pennsylvania.