Fall 2019
CTDO Magazine

Net Promoter Score: Take It or Leave It?

Monday, September 16, 2019

The argument: It's time to retire the Net Promoter Score.

According to NetPromoter.com, the Net Promoter Score is a tool for measuring customers' experience and for predicting a business's growth. Beyond gauging business success, it also can serve as a form of feedback on L&D training programs and events. Proponents say the metric has transformed the business world and now provides the core measurement for customer experience management programs around the world; naysayers have recently argued that NPS should be retired as a measurement tool. Are they right? Or does NPS still hold value for businesses and for L&D?


Pro

Ron Shevlin

Director of Research, Cornerstone Advisors

Now is the time to realize that American businesses' favorite metric—the Net Promoter Score—is nothing more than management snake oil. It's mind-boggling that so many companies rely on NPS as a key management metric when you consider the ways the score fails:

  • NPS doesn't explain why a customer would recommend the firm. Let's say a bank finds out that 10 percent of its branches score much higher than the average on the NPS and that 10 percent score much lower. What has the organization learned? Nothing. Some may argue that it provides clues as to where to dig in, but it would be more useful to find out the root causes in the first place.
  • NPS doesn't account for consumer demographics. Younger consumers typically refer products and services they like more often than older consumers do. If a company's NPS increases from one year to the next, was it because the firm improved its products and services or service delivery, or did it simply reflect an underlying change in its customer base's demographics?
  • NPS can incentivize undesirable behavior. One executive told me about an interaction he had picking up his car at the dealer's repair shop. The shop manager told him, "If there's any reason you wouldn't check off the ‘likely to recommend' box on the customer satisfaction survey, please let me know before completing the survey." Do you want your firm's personnel asking customers to say they'd refer the firm to friends and family or doing the things that earn a referral?

Those points should be enough to banish NPS from the slate of metrics that management uses. But an even better argument for retiring NPS is that it measures intention, not behavior.

It's 2019. Are Google, Facebook, and Apple collecting attitudinal data and inferring behavior from it? Of course not—they collect behavioral data and infer attitudes from it. Many companies even determine bonuses based on NPS. Using that logic, I should have received a significant bonus last year because my intention to bring in many sales was—on a scale of 1 to 10—a 10. Pay no attention to the fact I fell way short of the goal; my intention was there.

The underlying premise of the NPS metric is that referral intention is an important attribute or trait of a loyal customer. The focus on NPS was worthwhile from the perspective of getting executives to look at more than just purchase behavior. After all, in many industries, consumers don't have a need to get a new product (or service) every week, month, or even year. But loyal customers can do more than just stick around—they can promote the company.

But intention to refer is not the same as actual behavior. The mechanisms for tracking referral behavior are available, and many firms already track it.

Companies should replace NPS with a Referral and Purchase (RAP) sheet that captures the percentage of customers who provided referrals during the period and the percentage who added new products (or expanded their relationship). Using a RAP sheet involves:

  • creating a score—the RAP Score is simply the share of customers referring multiplied by 100, plus the share of customers adding products multiplied by 100, all divided by two; if 100 percent of customers referred and added products, the RAP Score would be 100 (see the sketch after this list).
  • comparing results—as with NPS, look at how various customer segments compare to each other and how different parts of the organization compare.
  • expanding the analysis—over time, expand the RAP sheet to include more detailed levels of referral and purchase behavior, such as how many referrals customers provide and how many new products or services they add.
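
To make the arithmetic concrete, here is a minimal sketch of the RAP Score calculation described above, written in Python. The function name, argument names, and sample figures are illustrative assumptions, not part of Shevlin's proposal.

```python
def rap_score(customers_referring: int, customers_adding_products: int, total_customers: int) -> float:
    """RAP Score: the average of the share of customers who referred and the
    share who added products, each expressed on a 0-100 scale."""
    referral_rate = customers_referring / total_customers        # e.g., 0.40
    purchase_rate = customers_adding_products / total_customers  # e.g., 0.25
    return (referral_rate * 100 + purchase_rate * 100) / 2

# If every customer both referred and added a product, the score is 100.
print(rap_score(1000, 1000, 1000))  # 100.0

# A more typical period: 40 percent referred, 25 percent added products.
print(rap_score(400, 250, 1000))    # 32.5
```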

Con

Rachel Hutchinson

Director of L&D, Hilti

Net Promoter Score is a single question: "How likely is it that you will recommend X to a friend or colleague?" The metric is increasingly used in L&D, where you often find it either standing alone to provide feedback to organizers or included as the first question in an after-event survey.
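
For readers less familiar with the mechanics, the standard NPS calculation groups 0-10 responses into promoters (9-10), passives (7-8), and detractors (0-6) and subtracts the percentage of detractors from the percentage of promoters. The short sketch below illustrates that standard formula; the sample responses are hypothetical and not drawn from Hilti's surveys.

```python
def nps(responses: list[int]) -> float:
    """Standard Net Promoter Score: percent promoters (9-10) minus percent
    detractors (0-6); passives (7-8) count toward the total only."""
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return (promoters - detractors) / len(responses) * 100

# Ten survey answers: 4 promoters, 3 passives, 3 detractors -> NPS of 10.
print(nps([10, 9, 9, 10, 8, 7, 8, 6, 3, 5]))  # 10.0
```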


NPS has gained popularity because executives often see benefit in it and learning practitioners are joining in. Eivind Slaaen, senior vice president of HR and head of people and culture development at Hilti Group, states, "I like NPS as it challenges me to take action. NPS is a ‘brutal' measure that tells me if my customers find my learning solution effective. All of us only recommend things to friends that we can stand fully behind."

Many executives appreciate this reference to brutal feedback because NPS is a straightforward question that takes into account all the minutiae that occur during a program or event and distills how the sum of the experience affected the participant. The measure is thus inherently satisfying to executives, who do not want to hear that the food was good or the trainer was bad; they want to know whether people will return, and recommend that others attend, based on the overall experience. L&D practitioners may need or want to know more detail about trainer facilitation styles or how to improve the registration experience, but executives rarely concern themselves with that level of detail.

Another benefit of NPS over traditional after-event or program surveys is that the metric is brief. People know they only need to answer one question, and they know that their answer will affect the overall rating. With many surveys containing five, 10, or even 20 questions, learners often drop out midway through, especially if they believe that their answers won't change the environment for them or for a future participant. At Hilti, our surveys are anonymous, but some organizations give participants the option to add their name, enabling the organization to follow up with promoters and detractors for more in-depth research that informs future modifications. I have seen this most frequently with organizations that release a minimum viable product and want to shape the final iteration with user feedback.

All that said, I am not advocating that organizations only use NPS. At Hilti, we define success criteria with stakeholders at the beginning of every project with a simple request: "In six months, if we are successful, describe to me what your team/organization/situation looks like and how they are performing." The responses to that statement ensure that we have observable and measurable success criteria.

Rather, I am advocating for NPS as a simple, succinct, executive-friendly temperature check of your solution. In one of our talent development programs, we use NPS at one and two years after program completion. In this way, we see whether participants can connect their experiences in the program with their career progression and the challenges they overcome along the way. We do this because the program is intended to provide long-term rather than short-term tools for their continued success as they move through the organization. In this program, which is application-only, people often nominate their own team members as they move up through the organization. Knowing when we need to change things helps us communicate those changes to future nominators.

NPS will remain a controversial method, but I encourage talent development professionals to give it a chance. Take a look at some brutal feedback and let it drive you to improve. As author and consultant Julian Stodd said in a recent interview, "Recognize that engagement is earned, not demanded." Be bold—earn your learners' engagement.


About the Author

Ron Shevlin is director of research for Cornerstone Advisors.

About the Author

Rachel Hutchinson is senior manager of global training and learning at Hilti. She leads a team of global training consultants and project managers for Hilti AG, an international company based in Liechtenstein with 23,000+ employees in 120 countries. She works closely with stakeholders at all levels to define optimal ways to affect results across the organization.