Re: How to review Customer Satisfaction Clause when the Survey forms are not returned
I think it's important to point out that calculating scores can easily degenerate into an exercise of adding apples to oranges and dividing the sum by pomegranates.
And once you have some numeric score, assuming it's a sound number, how do you decide what it means? Is it good? Is it bad? If it's lower than last quarter, is it random variation, or is something actually going wrong?
The point here is that numerical scoring and its interpretation carry at least two risks. First, it's so easy to do wrong. Second, numbers are so seductive they can easily mislead management into doing the wrong thing.
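To make the "random variation or real change?" question concrete: a minimal sketch of a crude control-chart style check, where a new quarterly score is only flagged if it falls well outside the spread of past quarters. The scores and the three-sigma band here are illustrative assumptions, not recommended values, and this kind of check is an aid to judgment, not a replacement for it.

```python
import statistics

def score_looks_unusual(history, current, n_sigma=3.0):
    """Return True if `current` falls outside n_sigma standard
    deviations of the historical quarterly scores.
    A crude control-chart style check -- it only tells you the
    number moved more than usual, not why."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return abs(current - mean) > n_sigma * sd

# Eight past quarters of hypothetical average survey scores out of 5:
past = [4.1, 4.0, 4.2, 3.9, 4.1, 4.0, 4.2, 4.1]

print(score_looks_unusual(past, 3.95))  # within normal variation: False
print(score_looks_unusual(past, 3.2))   # well outside the band: True
```

Even this tiny example shows the trap: the answer depends entirely on having enough honest history and on the scores meaning the same thing each quarter, which returned-or-not surveys rarely guarantee.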
Or even (mis)lead auditors into thinking that the company is 'doing' this stuff when it isn't.
Too often it's a too-easy, done-for-the-auditor exercise.
Q: What are you doing about customer satisfaction?
A: Oh, here's our latest survey, look at our graph.
Uh huh.
But I have seen too many nifty-looking graphs based on 'surveys' that looked nice and satisfied auditors. Yet when I asked the companies what value the surveys were, what useful information they were getting from them? Nil.
IF you can get a reasonable survey (meaning effective questions) and IF you can get people to return them, and IF you get useful data from them, then I agree they are a useful tool. But they are not the be-all and end-all and they are not the only method.
I've had many long and valuable discussions with my clients about this, and been fascinated about their various unique situations and how they have grappled with this question of how to monitor their customers' perceptions. And no, they most certainly had not only just started to do this stuff when they decided to go ISO!
Ask yourselves the question: why would someone fill in and return a survey to you? Why would you? And how many have you dropped in the bin?
I spent two years answering customer surveys for my bank when it rang and asked. After four rounds of that (they were half-yearly), I jacked up and refused to do any more. Same old questions, they didn't ask what I wanted them to, didn't allow for special cases or comments, and most importantly: I could not see that any changes occurred at all. So why waste my valuable time?
I disagree that qualitative data ain't data. It is. Intelligent management listens to what their customers say, and takes it on board.
I'd a sight rather have a 10-minute discussion with someone I really felt was listening to what I was saying than waste 5 minutes on those stupid one-size-fits-all surveys. Or have a long management discussion about (say) repeat customers and why they are still repeat customers, than try to get said repeat customers to 'fill in a survey'.