Tuesday, March 18, 2008

Technology Surveys

We recently had a discussion about the value of surveys. You know - those postcards or web sites that ask you to rate the quality of a product or service you've just received. Typically the questions are framed so that providing any negative feedback is difficult, you always wonder if they are truly anonymous, and you figure that no one looks at them anyway.

I did stand-up training for several years and we surveyed the students after every session - typically four hours. These numbers were crunched into spreadsheets that showed our classes were rated 4.85 on a 5-point scale. After a while, the number of respondents was so high that a negative review did nothing to the overall average. Combine this with the fact that most people don't think much about their answers - they just want to get back to their lives - and you begin to realize that the surveys don't really mean much.

Don't get me wrong, I'm a proponent of user feedback, customer satisfaction, and reader-centric writing. It's just that most surveys, and the survey processes behind them, don't yield much, especially if bad reviews are 'erased' or skipped because they fall outside the norms. And yet... these are the surveys that provide the most value, and should receive the most attention. We had a case in the past month where one of the technology teams sent an automatic survey out after a service call, and the numbers came back well within the norms, which is to say very high. However, the customer included a remark that they wished the service could have started sooner.

One of our executives asked what the technology team was doing about the remark, and received a diatribe about how this is not the norm, nobody else says this, defense, defense, defense. In other words - the surveys are worthless. As long as the numbers are high, we'll use them to promote our quality. When we get something negative, we'll spend our time developing all the reasons to ignore it.

In another case, a technology team collects a 10-question survey after each event, and currently 90% of the respondents rate the service as either Excellent or Perfect (7 or 8 on an 8-point scale). This team recently received one survey with numbers like 2, 3, and 4. An analysis was immediately performed, and the team concluded that the root cause of the poor evaluation was outside-the-process communication that took place after the event. In other words, the event being rated was fine, per se, but back-channel sniping between participants had left a sour taste in the customer's mouth.

In this case the technology team had a focused conversation reminding the service providers to stick with the defined process, not to engage in extra-process communication, and to leverage the defined process to communicate necessary information. In short, the survey data was used to improve the situation, at least for the next event.

The primary driver for collecting survey data should be to improve the process, not to provide marketing material for how great you're doing. The comments and negative reviews are more valuable than hundreds of smiley faces!
