Performance monitoring: When is 'good' good enough?

4 June 2020

More and more organisations are keeping a close eye on the performance of their IT landscape. After all, it is very important to minimise downtime and to create the best possible user experience. Performance monitoring tools produce interesting data, but how do you interpret it? What does that 4-second response time actually mean?

As of 2017, performance monitoring is indispensable for major organisations. Poor performance leads to reputational damage, complaints from users and even loss of revenue. With the aid of passive or active monitoring, you can gain insight into what the end user is actually experiencing. It is also clear that good performance delivers more value. But the more performance monitoring tools are put in place, the more difficult it becomes to interpret all this data.

CREATING CONTEXT

In practice, system administrators often compare performance figures against what was agreed in the SLA: is the supplier meeting the requirement or not? For (IT) management, however, it is far more relevant to know what it means that a transaction takes 4 seconds. After all, an 8-second response time is adequate for some transactions, while other applications are expected to respond within two seconds. Think, for example, of logging into a VDI environment just once per day; it is no problem if that takes a little longer. But a web store? There, research shows that 40% of visitors give up if a page takes more than 3 seconds to load. In short: a good response time depends on its context.

FROM IT PERFORMANCE TO BUSINESS PROCESS

It is important to look not only at individual IT components but also at the bigger picture: what role does an application play in the business process? For example: if the front-end application of an ordering process works well, customers can manage on their own and the call centre has to deal with fewer questions over the phone. And when the call centre application works optimally, staff can handle more questions. The ‘order service’ business process therefore consists of individual IT components that have to be seen in the context of that process.

The next step, then, is to quantify business processes in terms of IT performance. Where does performance play the most significant role? What does the customer expect? When does the business benefit (or suffer) from a certain level of performance? In this way, minimum and maximum parameters can be established for each transaction in an application; the sketch below illustrates what that could look like. The measured performance can then be indexed with the aid of the Apdex.
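
To make this concrete, here is a minimal sketch of what such per-transaction parameters could look like, expressed in Python. The transaction names and threshold values are illustrative assumptions chosen to mirror the examples in this article, not figures from any particular monitoring tool.

```python
# Purely illustrative targets (in seconds) per transaction; names and values
# are assumptions made for this example, not output from a real tool.
TARGETS = {
    "vdi_login":       {"satisfied_under": 4.0, "tolerated_under": 7.0},  # once a day, may take longer
    "webshop_page":    {"satisfied_under": 1.5, "tolerated_under": 3.0},  # visitors give up after ~3 s
    "customer_search": {"satisfied_under": 2.0, "tolerated_under": 5.0},  # frequent action, kept tight
}
```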

PERFORMANCE MONITORING WITH THE APDEX

The Apdex (Application Performance Index) is an open industry standard for indexing performance. After the requirements and expectations have been defined and converted into quantitative parameters, they are incorporated into the Apdex formula. Here, the response times for each transaction are distributed over three bands: ‘satisfied’, ‘tolerated’ and ‘frustrated’. For example, does logging into application X take a maximum of four seconds? Then that measurement is marked as ‘satisfied’ and is given 1 point. Does logging in take between 4.1 and 7 seconds? Then that measurement is marked as ‘tolerated’ and is given half a point. Measurements of more than 7 seconds are marked as ‘frustrated’ and are given no points. The total number of points is then divided by the number of measurements, giving a score on a scale of 0 to 1.
These parameters are set for each individual transaction, so that logging in – a one-off action – may legitimately take longer than a frequent search by customer name. The average Apdex figure across the process then determines whether ‘good’ is good enough; a figure of 0.85 or higher is regarded as good.
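
The calculation itself is simple enough to sketch in a few lines of Python. The function below is a direct reading of the formula described above; the thresholds and measurements are invented for the example and do not come from any real monitoring tool.

```python
from typing import Iterable

def apdex(response_times: Iterable[float],
          satisfied_under: float,
          tolerated_under: float) -> float:
    """Apdex = (satisfied + tolerated / 2) / total, on a scale of 0 to 1."""
    times = list(response_times)
    if not times:
        raise ValueError("no measurements to index")
    satisfied = sum(t <= satisfied_under for t in times)
    tolerated = sum(satisfied_under < t <= tolerated_under for t in times)
    return (satisfied + 0.5 * tolerated) / len(times)

# Invented measurements for the login example: <= 4 s is 'satisfied',
# 4-7 s is 'tolerated', anything longer is 'frustrated'.
login_times = [2.8, 3.5, 4.6, 6.9, 8.2, 3.1, 9.4, 2.2, 1.9, 3.3]
print(f"Login Apdex: {apdex(login_times, 4.0, 7.0):.2f}")    # 0.70 -> below the 0.85 'good' mark

# Process-level view: average the per-transaction scores, as described above.
search_times = [0.8, 1.2, 2.4, 1.1]                          # hypothetical search measurements
scores = [apdex(login_times, 4.0, 7.0), apdex(search_times, 2.0, 5.0)]
print(f"Process Apdex: {sum(scores) / len(scores):.2f}")     # 0.79 for this sample
```

Monitoring tools typically perform this kind of calculation continuously over a reporting period, but the principle is the same: count the measurements per band, weight them, and divide by the total.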

FOCUSING ON WEIGHTED FIGURES

Once a weighted performance figure is known, management gains new ways to steer. For example, it becomes easy to focus on end-user satisfaction or on the efficiency of a process: with an Apdex value of 0.85 or more, users can be regarded as ‘satisfied’ and the process can be deemed ‘efficient’. Based on the index figure, different applications can also be compared with each other, so that an informed choice can be made on whether or not to invest in certain IT components.
In short, intelligent handling of performance monitoring tools shows clearly when ‘good’ is good enough. The Apdex functions here as ‘glueware’ between technology, people and processes; the business, IT and management all look towards a single point of truth which is comprehensible and relevant for all concerned.

