Methods & Tools

How to measure IT user satisfaction

by Karsten Tampier

Conventional, in-depth surveys of IT user satisfaction often yield only averages and take a long time to conduct. They need to be supplemented with short, quick analyses.


In the past there was an annual user survey; today the exercise is called a "Listening Strategy": the more customer happiness becomes the focus of the IT organisation, the more important their holistic satisfaction becomes. As a management tool with significant economic benefits, an employee survey can generate momentum for initiating targeted improvements. After all, IT should support business units in the best possible way to achieve their goals.


Are full IT user surveys still appropriate?

Conventional surveys of IT users are still used to track long-term trends and to see in which direction the organisation is moving. However, large surveys are increasingly being stripped of questions that are better addressed with dedicated "pulse checks": topic- or department-specific analyses require shorter cycles, especially in agile environments. Customer surveys have to get to the point more quickly.

What types of surveys are suitable in IT?

User satisfaction is often surveyed after a certain number of closed tickets. Transparency about project progress and insights into the design of the user-centred IT workplace can be gained with short pulse surveys or mood checks. As the use of software continues to gain in importance, usability or user experience is also evaluated with key figures; here, for example, the Usability Metric for User Experience (UMUX) is employed. In addition, companies continue to determine overall sentiment every 12 or 24 months with an overarching, traditional survey. It focuses on the most important points of contact with the IT user, such as the service desk or personal hardware.
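To illustrate the UMUX key figure mentioned above: the standard UMUX questionnaire has four items on a 1-7 Likert scale, where items 1 and 3 are positively worded and items 2 and 4 negatively worded, and the result is rescaled to 0-100. A minimal sketch (the function name is illustrative, not from any specific tool):

```python
def umux_score(responses):
    """Compute a UMUX score (0-100) from four 1-7 Likert responses.

    Items 1 and 3 are positively worded (higher is better);
    items 2 and 4 are negatively worded (lower is better).
    """
    if len(responses) != 4 or not all(1 <= r <= 7 for r in responses):
        raise ValueError("UMUX needs four responses on a 1-7 scale")
    i1, i2, i3, i4 = responses
    raw = (i1 - 1) + (7 - i2) + (i3 - 1) + (7 - i4)  # range 0..24
    return raw / 24 * 100

# A fully positive response pattern yields the maximum score:
print(umux_score([7, 1, 7, 1]))  # → 100.0
```

Averaging these scores across respondents gives a single usability figure that can be tracked from one short survey to the next.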

Which role do stakeholder interviews play today?

Focused interviews with representatives of individual departments or divisions show directly where things are not going well - whereas in conventional annual surveys, such criticism could disappear in the average value of relatively satisfied customers. Focused interviews can be set up, conducted and evaluated within a few weeks. Example: what is going on with the CRM application and its performance in the cloud? In discussions with the department concerned, the problem can be identified quickly and targeted countermeasures can be taken. If, on the other hand, feedback is only requested a year after the problem arises, such issues can no longer be addressed in time.

What are the challenges of employee surveys?

Through different survey formats, organisations can analyse satisfaction in detail, both in the short and long term. However, there is also a risk of a flood of data and insights that can no longer be processed. Moreover, employees lose interest in surveys if they are asked too often or if, from their point of view, their answers have no impact. A clearly defined listening strategy that rules out rash actions is therefore important.

Sustainability and satisfaction

Employee satisfaction is an important component of sustainability strategies, where sustainability in this context means analysing satisfaction over a longer period of time. However, it is not only classical full surveys that should be used to analyse the trend and relate the development of satisfaction to decisions taken. Pulse surveys and stakeholder interviews can also provide the basis for a well-founded trend analysis - even if the contents of these formats are usually more volatile. A satisfaction index is a good way to implement this: a question on the general development of IT over the past twelve months can be included in all pulse surveys and stakeholder interviews. If an organisation uses the opportunities offered by these short surveys regularly, changing target groups are no obstacle to trend analysis.
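The index idea above can be sketched very simply: average the recurring index question per survey wave, regardless of which format the wave came from, and read the trend off the resulting series. The wave names, scale and figures below are hypothetical, purely for illustration:

```python
from statistics import mean

# Hypothetical answers to the recurring index question
# ("How has IT developed in the past twelve months?", 1-5 scale),
# collected from different short-survey formats.
waves = {
    "2023-Q1 pulse": [4, 3, 5, 4],
    "2023-Q2 interviews": [3, 4, 4],
    "2023-Q3 pulse": [5, 4, 4, 5, 3],
}

def satisfaction_index(waves):
    """Average the shared index question per wave to build a trend line."""
    return {wave: round(mean(scores), 2) for wave, scores in waves.items()}

for wave, index in satisfaction_index(waves).items():
    print(wave, index)
```

Because every wave asks the same question, the series stays comparable even when the participants change from one wave to the next.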

Implementation and feedback from satisfaction surveys

Regardless of the survey format chosen, it is important to inform customers, staff and users about all results of the survey. Findings that are presented selectively or one-sidedly open the survey up to criticism, and acceptance of the results and of the measures derived from them decreases. Every good report of survey results should therefore be accompanied by an implementation plan. Participants must feel their answers are being heard. The point is not to fulfil every wish, but to show which suggestions and changes will be implemented - and also which will not (yet) be realised.


Do you have questions about the Listening Strategy?

Just send us an email - we will be happy to answer your questions and, if necessary, set up a test survey for you.

Karsten Tampier

With over 25 years in benchmarking, Karsten Tampier knows what a fair comparison looks like. With his team, he is responsible for data analytics at Metrics, and thus for the data lake and for methodological data consistency in customer projects.