Thursday, February 20, 2014

Customer Survey Apocalypse

I recently bought a fairly expensive item. I shopped around, decided what I wanted, and then went in to finalize the deal.

As I was completing the purchase and shaking hands with the salesperson, he leaned in and, in a confidential tone, told me a story. I would receive a survey by email in a week or so asking about my experience buying the item. The survey would cover a number of topics: how clean the showroom was, whether I was told about all the options, whether the salesperson was helpful, and so on.

The reason for telling me this, he explained, is that sales personnel are rated on the responses they receive. What's more, if they receive any response less than "exceptional", they are considered to have failed. So, if I was happy with the purchase, could I please mark everything as "exceptional" on the survey?

There are so many things wrong with this experience it is hard to know where to begin. But what spurred me to write about it is that this is the second time this has happened to me in the past year — dealing with completely different companies and different services!

So where to begin?
  • First, I actually did have a good experience with the purchase and would, without prompting, have given a good review in response to the survey.
  • But having been prompted, I am tempted to not respond to the survey at all rather than deal with the pressure to up my score.
  • I don't blame the sales person. If he is being judged on each survey, why not try to game the system and improve the scores?
  • I do blame the company. They have soiled what was a pleasant experience and made it seem seedy and somehow underhanded.
  • I understand the need to stay in touch with the customer experience. And as problematic as they are, surveys are one mechanism to achieve this, especially for large national or multinational corporations.
  • But having said that, conflating customer awareness with personnel management is a disaster waiting to happen!
I know people (and I am not far from being one myself) who would, out of some perverted sense of justice, actually lower their scores on the survey in response to the request to score high. There are others, obviously, who would go along to be friendly and raise their scores, regardless of what ratings they might have given without external prompting.

Survey results are problematic enough to start with. A 10% response rate is often considered "good". Add to that limitation the fact that those who hold the strongest opinions — either for or against — are usually the most eager to express them. And now you further skew the results by having employees push participants toward either extreme.

So rather than getting an accurate (or even semi-accurate) picture of the customer's experience, they have invalidated the entire process and skewed the results. At the same time, they are using a completely spurious method for assessing their employees. Don't assess your customer-facing employees on what the customer says! It is like telling the cleaning service their pay depends on how clean the bathroom sink is. You can be sure the bathroom sink will be clean — even if the rest of the house isn't touched!

If you are going to rate customer-facing employees on what customers say, base it on the aggregate, not on individual scores. You are still going to sabotage the accuracy of your customer responses, but at least you would reduce the damage. And, possibly, not disillusion the customers you rely on to spread positive word-of-mouth about your business.



3 comments:

John Piekos said...

The question is: did you go to this place as a result of some published/public rating? I am guessing not.

I view these surveys as internally focused — salesperson bonus/rewards. So if you liked the person, vote him up, maybe they get a year-end bonus.

It probably was a car dealer!

Esther Daang said...

I'm curious as to what you ended up doing about the survey.

Andrew Gent said...

Hi Esther,

I started answering the survey, but stopped before completing it because the questions quickly became personally intrusive (such as how many people are in the household, annual income, etc.). Again, a case of trying to use one survey for too many purposes....