
Customer Experience (CX) Programs: Past, Present And Future

2nd May 2019

The article below was originally published on Forbes.

I recently took my car in for servicing and got it back the same day. But after less than an hour, the problem reappeared. I took the car back, and after some delay, the issue was resolved. While I was collecting the vehicle, the service associate informed me that I would be receiving a survey and asked if I would kindly give him a 9 or 10.

After sharing my story with friends, I realized that my experience was no exception, which prompted us to launch our own survey on the premium and luxury automotive segment. We found that the customer experience (CX) with most brands was poor, yet most of them were boasting about their high customer satisfaction rates. This inevitably led to the conclusion that CX measurement is flawed.

Feedback Overload

Over the years, the actual customer experience with many brands has remained the same. If one thing has changed, it's the number of feedback requests consumers receive on a daily basis. However, the majority of people who end up answering are either very unhappy or very happy customers, so brands are left with highly polarized results of little usefulness.

CX Measurement Tools

The CX measurement tools available today come in different sizes and shapes and accommodate all budgets but often miss two key points:

 What do you really need to measure?

 How will you effectively leverage the data?

We need to remember one thing: However sophisticated, no single tool can measure it all. A sound approach will involve using an assortment of methodologies that complement one another.

Start with your customers.

Before measuring anything at all, it's a good idea to start with your customers, considering things such as what matters most to them, what the key touch points and overall journey are, and where the opportunities lie.

Coming back to the automotive industry, we know that a critical touch point for most customers is the test drive. Yet most brands do not measure how it went or whether any follow-up took place in a timely manner.

One thing I find striking is that most brands begin by picking a tech solution for their customer experience programs, then struggle to make it relevant to the customer journey and the team's daily work. It appears that some leaders are more interested in hitting targets (e.g., a high NPS) than in genuinely understanding the key drivers of customer perception.

Choose where you want to excel.

Define your peak moments, when you want to surprise and delight your customers, as well as the final moment, since it is what customers will remember best. This is where you should channel your resources and what you should strive to measure.

A trip to Disneyland, for example, usually means long queues, average meals and mostly boring rides. But it also comes with peak moments, such as the pictures taken with the Disney characters, and in the end, the fireworks are what you will likely remember. Whether we feel good or bad is determined not by the actual experience but by our remembered experience.

Give 'lag indicators' a rest.

More often than not, brands measure lag instead of lead indicators. A lag measure (e.g., NPS) reveals whether you have achieved a goal, while a lead measure signals whether you are likely to do so. For example, to achieve a high NPS, a retail team might need to perform role-plays on welcoming customers. The number and quality of those role-plays would be among your lead indicators.

Since lead indicators track the critical activities driving or leading to the lag measure, lead indicators predict its success and are influenced directly by the team.

Simple enough, but brands need lead indicators in addition to lag measures. This is crucial because what you measure drives behaviors.

Beware of dashboard myopia.

Dashboards provide us with a handy tool for coping with data overload, enabling us to present large data sets in a visually appealing manner. However, dashboards and the scores displayed are dangerous for a couple of reasons:

1. They can confine our thinking within a set framework, limiting our ability to explore beyond the dashboard and its KPIs.

2. They push teams to operate under a fixed mindset. The scores thrown at them do not say "not yet" or "nearly there" but instead deliver a verdict: "aced it or not." They typically show only the final result, not the effort or the process of getting there. Is it any wonder, then, that teams end up cheating to get high scores?

Focus on the team’s actions.

The key is not to obsess over NPS, CSAT or any other CX index but to focus on the actions that will help teams deliver a good customer experience. You could facilitate this through role-playing, sharing best practices, or walking in the customers' shoes.

Bring your team on board.

Unless your frontline teams are part of the process, fully on board and determined to deliver a truly unique and inspiring customer experience, little will happen. The retail mantra used to be "location, location, location." I say it’s time to change it to "people, people, people."

Let’s Recap

As we have seen, it's critical to start with the customer instead of the technology, to contemplate the customer journey in its entirety, and to measure the overall experience by using different tools. Focusing on the drivers of good CX through lead measures will help improve your lag measures.

And while dashboards are important, we need to ensure they do not trap us into a rigid mindset where only the end result matters while the efforts expended to get there are overlooked. Lastly, we need to involve our teams in the process and regularly seek their feedback in a safe and constructive way, which will ultimately inspire them to deliver a memorable experience to our clients.

Christophe Caïs