
Jan 7, 2011

Customer Satisfaction Survey: What Methods To Use?

How do you gather customer satisfaction data? What methods are there? What does ITIL say? Which methods will work for you?

USA Today has come out with a new survey - apparently, three out of every four people make up 75% of the population.
David Letterman
 
In my last post I spoke about Customer Satisfaction. OK, but how do we gather data? How do we interview customers? How do we get the best results from surveys and help educate the customer about the process?
Here I give you a list of basic ITIL methods for gathering customer satisfaction data, along with my comments on their worth in my company. Mind you, we are in the Managed Services business, so my grades are biased.

If you are a small IT department in a government agency or the IT of an airline company, you may look for different values. Still, this will give you an idea.


Customer Satisfaction


After-Call Survey
Callers are asked to remain on the line after the call and rate the service they received.

This method requires some kind of telephony automation, such as Interactive Voice Response (IVR). The thing with this method is that however you set up the system, customers will often despise it. Most people don't like listening to recorded messages, and even more hate navigating a menu tree with the phone keypad. The outcome? Only the extremes: customers who are very satisfied and wish to reward an agent with good grades or, more often, a very pissed-off customer who will go out of his way to express his low level of satisfaction.

Call Center organizations are doomed to use this technology since phone calls are their only way of communicating with the customer. In ITSM there are usually more ways to talk to the customer.

My worth: 1/5


Mail Surveys
We never considered sending mail surveys to customers as an option. Not in this century, at least. What kind of IT-related person would fill in the received paper form and send it back to us? Do we need feedback from that kind of person? No, thanks.

With e-mail it's maybe slightly better. We used to have a Service Desk application which would send an automated notification on the incident resolve event, asking the customer contact to reply with an e-mail grading their satisfaction with the speed and quality of resolution on a scale of 1 to 5. So lame. A response rate of 0.05% was enough to pacify our ISO 9001 auditor, but it was next to worthless to us otherwise. One good thing was that receiving a small number of answers in unstructured text e-mails made the result processing much easier :-)
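To illustrate why this approach gives so little to work with, here is a minimal sketch (in Python, all names hypothetical, not our actual tooling) of the kind of processing unstructured e-mail replies allow: fishing a grade out of free text and computing the response rate.

```python
import re
from typing import Optional

def extract_grade(reply_body: str) -> Optional[int]:
    """Pull the first standalone 1-5 grade out of a free-text e-mail reply.

    Returns None when no usable grade is found, which in our experience
    was the most common case.
    """
    match = re.search(r"\b([1-5])\b", reply_body)
    return int(match.group(1)) if match else None

def response_rate(resolved_incidents: int, graded_replies: int) -> float:
    """Share of resolved incidents that actually came back with a grade."""
    return graded_replies / resolved_incidents if resolved_incidents else 0.0

# Example: 2 usable replies out of 4000 resolved incidents -> 0.05%
print(extract_grade("Thanks, I'd say 4 out of 5."))   # 4
print(f"{response_rate(4000, 2):.2%}")                # 0.05%
```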

My worth: 1.5/5


Group Interviews
In this method, customers are gathered in small groups and interviewed together. This can be somewhat useful for a group of people working together, though even then their answers will be biased and dependent on each other. The possible value I see in this one is that group interviews help develop communication between customers of different knowledge levels and enhance collective process knowledge.

In my organization we try to keep customers on different SLAs as far from each other as possible. We do not conduct group interviews, but we do organize ITSM workshops for our customers here and there; the feedback is usually very good.

My worth: 2/5


Phone Surveys
Some time after the interaction with the Service Desk, an independent agent calls the customer.

You are in the middle of your monthly report, juggling a database and three spreadsheets, and some guy calls you to grade your experience with the Service Desk from ten days ago? You will brush him off. OK, maybe not the first time, but the next time, for sure.

So, if you want to get results with this method, you need trained people with strong communication skills and good opening lines. It is also a good idea not to survey every call, to let the customer rest from you. If they anticipate your call, they start thinking up good excuses to brush you off. Surprise them periodically instead.

My worth: 2/5


Personal interviews
The surveying person is usually the Incident Manager or Service Desk Manager, while the customer representative is the person responsible for the SLA on their side. These can also be the contact persons for Major Incidents after the post-mortem reporting, but that's another story and should be defined in the Major Incident procedure.

These interviews are usually periodic and should be conducted at least annually (they usually are, during contract renewal), but preferably on a quarterly basis. During these interviews, the two parties communicate customer needs and help focus Service Support on the things important to the customer. The questions in the survey should therefore be aligned with Service Desk and Incident Management KPIs.

These are of great value to us.

My worth: 4/5


Online Surveys
We worked with several Service Desk automation tools from different vendors. In the end, we developed a web application which enables us to send different surveys to our customers. They can be ticket-related, periodic or ad hoc surveys, with predefined templates of multiple-choice and freeform questions. We like it so much that we host it for other departments of our company. And we do not sell it, it's that good :-)

Customers fill it in at their own convenience (OK, we sometimes call and beg them to fill in the surveys), and after they submit the form, their answers are automatically included in the report.
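Our tool is home-grown, so I won't show its real code; the sketch below is just an illustration of the data model such a survey application might use — templates mixing multiple-choice and freeform questions, and a report built automatically from submitted answers. All names and the 1-4 scale shown are assumptions for the example.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Question:
    text: str
    choices: list[int] | None = None   # e.g. [1, 2, 3, 4]; None means freeform

@dataclass
class SurveyTemplate:
    name: str                          # "Ticket", "Periodic", "Ad hoc"
    questions: list[Question]

@dataclass
class Response:
    answers: dict[str, object]         # question text -> chosen grade or free text

def build_report(template: SurveyTemplate, responses: list[Response]) -> dict[str, float]:
    """Average grade per multiple-choice question; freeform comments are skipped."""
    report = {}
    for q in template.questions:
        if q.choices is None:
            continue
        grades = [r.answers[q.text] for r in responses if q.text in r.answers]
        if grades:
            report[q.text] = mean(grades)
    return report

ticket_survey = SurveyTemplate("Ticket", [
    Question("How satisfied are you with the resolution?", choices=[1, 2, 3, 4]),
    Question("Any comments?"),
])
print(build_report(ticket_survey, [
    Response({"How satisfied are you with the resolution?": 4}),
    Response({"How satisfied are you with the resolution?": 3, "Any comments?": "Fast!"}),
]))   # {'How satisfied are you with the resolution?': 3.5}
```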

Web-based surveys are a life-saver to us.
My worth: 5/5


Other Methods
Blogs, tweets, wikis, forum discussions and chats are all nice to have and, if maintained well, they will influence customer experience positively.

However, it is next to impossible to mine a structured measurement of customer satisfaction out of them, and they shouldn’t be used for that purpose.


A Few More Notes About Surveys

Scoring
It is best to define simple questions with multiple-choice number ratings. How many grades? For fine gradation in longer interviews we use grades 1-10. For faster, ticket-focused surveys we opted for a coarser gradation. We used 1-5 for a long time, but too many of the grades clustered at 1, 3 and 5. Eliminating the middle value and shifting to 1-4 made our customers think more. We still use the 1-4 system for most surveys and we are satisfied with it.
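A minimal sketch of how you could check whether your 1-5 results pile up on the neutral middle value, which is what pushed us towards a 1-4 scale. The sample grades and the "50% is too much" reading are illustrative assumptions, not something we formally measured.

```python
from collections import Counter

def grade_distribution(grades: list[int], scale_max: int = 5) -> dict[int, float]:
    """Share of answers per grade on a 1..scale_max scale."""
    counts = Counter(grades)
    total = len(grades)
    return {g: counts.get(g, 0) / total for g in range(1, scale_max + 1)}

def middle_share(grades: list[int], scale_max: int = 5) -> float:
    """Fraction of answers that picked the neutral middle value (odd scales only)."""
    middle = (scale_max + 1) // 2          # 3 on a 1-5 scale
    return grades.count(middle) / len(grades)

grades = [1, 3, 3, 5, 3, 5, 1, 3, 5, 3]
print(grade_distribution(grades))          # {1: 0.2, 2: 0.0, 3: 0.5, 4: 0.0, 5: 0.3}
print(f"middle value share: {middle_share(grades):.0%}")   # 50% -> consider a 1-4 scale
```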

In addition, it is often nice to put an optional comment field under every question. It doesn't cost much, and sometimes customers leave very useful information there.

How many questions?
You certainly don’t want to scare of the customer with a bunch of questions. OK, annual interviews can be longer, and internal ITIL/ISO20000 audits and assessments can gave a gazillion questions, but survey on incident performance can have only one (how satisfied are you?), ideally three, or maximum five multiple choice questions. You fish for quantity here.

Anonymous or named?
Some customers will answer an anonymous survey rather than a named one. Well, that's their problem. This is one case where the customer is not always right. You want to know who is satisfied and who is not, so you can decide on your further actions.


Related posts:

Customer Satisfaction in ITIL Service Management: Do You Get It?
Elementary facts about ITSM Customer Satisfaction - How badly do we need it in IT Service Management? Where is it mentioned and where is it dealt with in ITIL V3? How do we manage it in real life?