CEB Blogs

Working with The Customer Effort Score

By Corey Stout

As many of you know by now, CCC believes that a low-effort customer experience is the cornerstone of customer loyalty.

In fact, a low-effort experience is the strongest predictor of loyalty: customers who are forced to put forth high effort are 96% more likely to become disloyal, compared with only 9% of customers who have low-effort experiences.

Following our own low-effort advice, in 2008 CCC created the Customer Effort Score, a customer experience metric that captures how easy it was for the customer to get a request resolved, so that you can track effort directly.

CCC Members: to find out more from CCC and your peers, I encourage you to register for our teleconference on the Customer Effort Score (CES) on March 16th and March 17th.

We’ll discuss with a panel of members how to implement, use, and demonstrate the results of the Customer Effort Score.

Here’s a preview of the topics we’ll discuss:

Implementation of the CES:

CCC recommends that companies ask the question, “How much effort did you personally put forth to handle your request?” We advise this particular question – the original Customer Effort Score – because it has been validated across 55,000 customers, and we have benchmarks that companies can use to evaluate their results.

That said, we’ve certainly seen companies experiment with the wording of the question and the recommended 5-point scale to stay consistent with other corporate survey questions. For example, some B2B companies express concern that the CES wording confuses their customers, since certain service-related tasks are simply part of their reps’ jobs.

Use of the CES:

Regardless of phrase/scale used, many companies find CES to be an incredibly informative and helpful metric. It has helped to:

  • Demonstrate project success – Decreasing effort scores highlight the effectiveness of specific changes.
  • Influence rep behaviors – As reps come to understand the importance of effort, they focus more on experience engineering.
  • Identify process fixes – Call types with unusually high effort scores point to areas ripe for process improvement.

Demonstrate the Results of the CES:

There are two primary ways that companies tie CES results to financial returns:

  1. Calculate disloyalty risk among high-effort customers to show how reducing the number of high-effort customers can improve repurchase. We find that customers who exert “high” effort (4 or 5 on the CES scale) are 61% less likely to repurchase and 23% less likely to increase spend with an organization, compared to the average customer. Use existing data on the percentage of customers who renew contracts annually, or your general customer churn rate, to get an average repurchase rate, and then estimate the expected loss in repurchases from high-effort customers specifically.
  2. Tie changes in your actual CES metric (e.g., one-point or one-tenth of a point change) to loyalty outcomes. With any one-point increase in CES (a 20% increase in your score), you could see a 14% decrease in intent to repurchase, for example. You can tie this data to any statistics you have on customers’ repurchase rate to illustrate the monetary returns of CES. 
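To make the first calculation concrete, here is a minimal sketch of the arithmetic. The 61% repurchase penalty comes from the figures above; the customer counts, high-effort share, and average repurchase rate are hypothetical inputs you would replace with your own data.

```python
def expected_repurchase_loss(total_customers, high_effort_share,
                             avg_repurchase_rate, repurchase_penalty=0.61):
    """Estimate repurchases lost to high-effort experiences.

    high_effort_share: fraction of customers scoring 4 or 5 on the CES scale.
    repurchase_penalty: how much less likely high-effort customers are to
    repurchase vs. the average customer (61% per the figures above).
    """
    high_effort_customers = total_customers * high_effort_share
    # Repurchases you would expect if these customers behaved like the average
    expected = high_effort_customers * avg_repurchase_rate
    # Repurchases you actually get, applying the 61% penalty
    actual = high_effort_customers * avg_repurchase_rate * (1 - repurchase_penalty)
    return expected - actual

# Hypothetical example: 10,000 customers, 20% report high effort,
# 80% average annual renewal rate
loss = expected_repurchase_loss(10_000, 0.20, 0.80)
print(round(loss))  # prints 976 lost repurchases per year
```

Multiplying the lost repurchases by average contract value then gives a dollar figure you can put in front of stakeholders.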

Looking forward to catching you all on the virtual side to hear more from your peers on their use of the Customer Effort Score!

What would YOU like to know about the Customer Effort Score? We’ll try to get your questions into the queue during the webinar.

9 Responses

  • Fiona Alder says:

    Will this web cast be available after the event?
    The biggest issues we have seen are an inability to translate the wording effectively into common languages. It just does not work. In addition testing with consumer customers shows that consumers struggle to understand the question as it is written.

  • Coryell says:

    Fiona, yes, the replay will be available to members on our site about 4 days after the sessions conclude.

    As you mention and our panelists will acknowledge, language translation can be tricky, but since the question you propose will obviously track effort at an aggregate level, getting an initial baseline understanding of current effort levels among customers will be helpful. For the most part, the companies we see using the CES question typically start by asking the overall effort question we propose. That being said, we have heard variations of the CES question from other companies. For example, “Compared to what you expected, rate the effort” or “How satisfied were you with the effort?” As companies become more comfortable with the metric and see longer-term data trends, we start to see more companies asking more targeted effort-related questions.

    Out of curiosity though, what question have you settled on?

    And to those reading, what has worked for you?

  • Mark Nuckols says:

    We’re working with a contact center client that wants to implement CES in their post-interaction customer survey, but wants to reverse the scale so that it’s consistent with other questions being asked (i.e. 1 is bad, 5 is good.) While I understand why they want to do this, I fear that it might introduce as much confusion as having different scales on the first question vs. the others (e.g. how can 1 = high effort, and 5 = low effort? That feels counter intuitive to me.)

    Does anyone have experience delivering the CES question in a survey alongside questions that use a different or reversed scale? And does anyone have experience reversing the scale of the CES question itself? If so, any guidance or pitfalls to avoid?

  • Brad says:

    Mark, that’s a great question and one we’ve heard a lot. We at CCC definitely know of companies that have reversed the CES scale and had success doing so, finding that customers can follow it well. The key is to clearly flag the switch in scale before the question; setting expectations helps avoid customer confusion during the survey. For example, companies may say “this next question is a bit different…” before switching to questions that differ from those preceding them.
