Reading the Tea Leaves: Use Customer Surveys to Effect Culture Change


12 Days of Insights 2014: Day 5

To celebrate the holiday season, we’re giving the gift of knowledge! Join us as we share our favorite posts from top technology services thought leaders in our 12 Days of Insights blog series.

How would you like to increase your customer satisfaction ratings for six months straight while (not coincidentally!) reducing resolution time by 50 percent? One of my clients did, and you can too, by following five relatively straightforward steps that link the operational processes of your support team to customer satisfaction. With a little effort and persistence, you'll be well on your way to changing your organizational culture for the better.

Step 1: Dig Into The Survey

Transactional support surveys essentially yield a single number, whether an average rating or a percentage of satisfied customers. While the simplicity of that one number is appealing, it's not enough to identify specific root causes of customer satisfaction and dissatisfaction. Depending on the size of your organization, you can use a variety of approaches. Better yet, use several approaches and cross-check the results.

  • Collect and analyze survey comments. It takes a surprisingly small amount of time to read through the comments of even hundreds of surveys, and as you do, themes emerge quickly. Even better, if you follow up on bad surveys (and you should), the additional information gleaned during the follow-up conversations is a treasure trove for this exercise.

  • Conduct a small-scale, detailed survey. Specifically ask customers to pinpoint the critical elements of a successful support interaction. You do not need to inflict the long survey on all customers; a small sample is sufficient, and you can extrapolate the results safely if you administer the follow-up survey to a random set of customers. You can either administer the detailed survey in an A/B test, delivering the longer version to some percentage of customers, or you can use it as a follow-up to the regular, shorter version. In any case, ask customers to rate the various aspects of case resolution on how much they contribute to the overall rating.

  • Correlate survey results and case characteristics. For instance, you may find that longer-lasting cases yield poorer ratings, suggesting that a swifter resolution process would help. Or you may find that cases that close very quickly garner poor ratings, which may be a sign that the team is rushing customers too much and perhaps missing a complete resolution of the issues (or that customers expect to resolve these very easy issues themselves, in self-service). It's very useful to contrast various product lines and geographies, if applicable, since the best benchmark is often internal.
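
If your case-tracking system can export each surveyed case with its rating and basic case attributes, this correlation check is easy to script. Below is a minimal sketch in Python with pandas, assuming a hypothetical export named cases_with_ratings.csv with columns rating, days_to_resolve, and product_line; the file and column names are illustrative, not tied to any particular tool.

```python
# Minimal sketch: correlate survey ratings with case characteristics.
# Assumes a hypothetical export "cases_with_ratings.csv" with columns
# rating (e.g., 1-5), days_to_resolve, and product_line.
import pandas as pd

df = pd.read_csv("cases_with_ratings.csv")

# Does satisfaction move with resolution time? A clearly negative
# correlation suggests longer-lasting cases yield poorer ratings.
print("rating vs. days_to_resolve:",
      round(df["rating"].corr(df["days_to_resolve"], method="spearman"), 2))

# Contrast the fastest and slowest cases to catch the "rushed
# resolution" pattern as well as the "slow resolution" pattern.
fast = df[df["days_to_resolve"] <= df["days_to_resolve"].quantile(0.10)]
slow = df[df["days_to_resolve"] >= df["days_to_resolve"].quantile(0.90)]
print("average rating, fastest 10% of cases:", round(fast["rating"].mean(), 2))
print("average rating, slowest 10% of cases:", round(slow["rating"].mean(), 2))

# Internal benchmark: contrast product lines (or geographies).
print(df.groupby("product_line")["rating"].agg(["mean", "count"]))
```

None of these numbers proves causation on its own, which is why cross-checking them against survey comments and the detailed follow-up survey matters.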

The goal of the analysis is to link your operational processes, what the support organization does every day, with customer satisfaction. You want to find the critical components for your specific environment. They could be response time, resolution time, poor communication during the course of case resolution, specific products with quality issues, technical holes in the team, scheduling weaknesses during certain times of the day, too many hand-offs, or a poor escalation process. Spend time validating the criteria. You don't want to launch a major initiative to cure, say, a process issue, when in fact it was all about a lack of soft skills, or vice versa.

Step 2: Strategize

Sometimes a single, simple change will suffice. For instance, if you find that evening coverage is skimpy, you can shift schedules around and voila: happy customers! But chances are that you will uncover deeper root causes that require a more comprehensive approach. For instance, imagine that you find customers are very concerned about slow resolution times, a common complaint for high-complexity support. Addressing the issue properly may require several changes, perhaps:

  • More staff, if the issue is purely related to workload. You will know this if everyone's rating is low (and the workload is evenly distributed), or if some engineers manage to get decent ratings, but only those who somehow handle fewer issues.

  • Better soft skills, if you hear from customers that they are not getting clear requests from the support engineers, requiring multiple troubleshooting interactions, or that they have to keep asking for status updates.

  • A sleeker case-resolution model with fewer hand-offs, if customers complain that they have to explain their issues again and again as they get transferred to new support engineers. This is common with overzealous follow-the-sun models.

  • An enforced service level agreement (SLA) with engineering, if resolution time delays are typically linked to bugs.

  • Better backlog management from team leads and managers, if stale cases are often allowed to linger with no activity and no clear next step. This may require better metrics.

At this stage, it's very useful to interview support engineers and managers about how they would address the problem, and to validate their instincts with hard data. It may feel to them that delays occur when they have to wait for a response from engineering, but you may be able to prove that, in fact, most slow-resolution cases don't ever involve engineering. Combine soft data with hard data, always.
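
This kind of check is easy to script against your case history. Below is a small sketch, assuming a hypothetical export named closed_cases.csv with a days_to_resolve column and an escalated_to_engineering flag; adapt the names and the definition of "slow" to your own data.

```python
# Minimal sketch: test the instinct "delays come from waiting on engineering."
# Assumes a hypothetical export "closed_cases.csv" with columns
# days_to_resolve and escalated_to_engineering (True/False).
import pandas as pd

cases = pd.read_csv("closed_cases.csv")

# Treat the slowest quartile as "slow"; choose whatever threshold matches
# how your customers experience delay.
threshold = cases["days_to_resolve"].quantile(0.75)
slow = cases[cases["days_to_resolve"] >= threshold]

share = slow["escalated_to_engineering"].mean()
print(f"{len(slow)} slow cases, of which {share:.0%} involved engineering")
```

The same pattern works for the staffing question above: group ratings by engineer, compare them against each engineer's case volume, and see whether the data actually supports the workload story.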

Arrange your strategy in layers:

  • Support offerings. This is rare, but perhaps the way your support offerings are structured is creating dissatisfaction. For instance, if you promise support in multiple languages but have only a few multilingual speakers, you should either hire more or tone down your promise.

  • Processes. A typical example would be to escalate issues quickly rather than allowing them to linger with a junior engineer.

  • People. A lack of soft skills is often linked to customer dissatisfaction. I'm not talking here about superficial skills such as voice modulation, although that's certainly a good idea, but about deeper skills such as driving the interaction assertively, making clear commitments, and managing cases so commitments are met promptly. Technical skills can also be at play here.

  • Tools. Scrutinize the case-tracking tool, online customer tools, and the bug-tracking tool.

  • Metrics. As mentioned earlier, poor metrics generate undesired behaviors.

Spend time testing and sharing the strategy with the team.

Step 3: Train

Few customer satisfaction gaps can be resolved without some amount of training, especially for soft skills. Insist on hands-on training. Reading one of the myriad customer service books will not trigger behavior changes, except perhaps in the top five percent of support engineers who are already delivering a good customer experience. Ask the managers to coach the support engineers on key behavior changes. It takes multiple weeks to undo familiar behaviors and adopt new ones. The managers (and team leads) should shower praise on moves in the right direction, even if the end results are not immediately different. Training managers on how to coach can be a wonderful investment in addition to training support engineers. 

Step 4: Measure

If you did a good job of linking customer satisfaction and operational processes, you know what needs measuring and monitoring. Focus on the few process areas that matter for your customers. For instance, if resolution time is a critical component of customer satisfaction, you will want to track aging cases, the percentage of cases resolved within a week, say, as well as cases without recent interactions.
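
These measures are straightforward to compute from a case export. The sketch below assumes a hypothetical cases.csv with opened, closed (blank for open cases), and last_activity timestamps; the 30-day, 7-day, and one-week thresholds are placeholders to adjust to your environment.

```python
# Minimal sketch of resolution-time metrics: aging cases, percentage
# resolved within a week, and open cases with no recent activity.
# Assumes a hypothetical export "cases.csv" with columns opened,
# closed (blank if still open), and last_activity.
import pandas as pd

df = pd.read_csv("cases.csv", parse_dates=["opened", "closed", "last_activity"])
now = pd.Timestamp.now()

open_cases = df[df["closed"].isna()]
closed_cases = df.dropna(subset=["closed"])

# Aging cases: still open after more than 30 days.
aging = open_cases[(now - open_cases["opened"]).dt.days > 30]

# Percentage of closed cases resolved within a week.
within_week = ((closed_cases["closed"] - closed_cases["opened"]).dt.days <= 7).mean()

# Stale cases: open, with no interaction in the past 7 days.
stale = open_cases[(now - open_cases["last_activity"]).dt.days > 7]

print(f"aging cases (open > 30 days): {len(aging)}")
print(f"resolved within a week: {within_week:.0%}")
print(f"open cases with no recent activity: {len(stale)}")
```

The exact thresholds matter less than tracking the same definitions consistently over time, so managers can watch trends rather than snapshots.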

Create a set of metrics that help managers monitor resolution time. Educate them on how to use the metrics to spot issues and to coach their teams. I once worked with a support manager who looked strictly at “touches” on cases to determine whether progress was being made. Let me assure you that his team members had become very adept at adding meaningless notes to their cases to stay off his radar, not creating any value for customers, to be sure, but doing a bang-up job of window dressing. You don't want to do that.

Step 5: Repeat, Repeat, Repeat

If you've defined a good strategy in Step 2, you should see results almost immediately as you implement it: results in the operational processes, that is, but not necessarily in customer satisfaction, since it will take a while for customers to shift their sentiment. Keep at it, and keep analyzing the linkages between customer satisfaction and processes, as they may change over time.

Conclusions

Can you link customer satisfaction to your operational processes? If not, analyze and validate the correlations. Do you have a comprehensive strategy to establish or reestablish the right processes and behaviors to generate customer satisfaction? If not, create one. Invest in those processes that matter most to your customers.


About the Author

Francoise Tourniaire is founder and principal of FT Works, a firm that provides diagnostic and implementation assistance for customer success programs and all aspects of customer satisfaction. For more information, visit www.ftworks.com or contact Francoise Tourniaire at FT@ftworks.com.


