Statistical analysis in a Service environment
Six Sigma and the DMAIC method for quality improvement originated in manufacturing and the electronics industry, but over the last two decades their use has transitioned into transactional and service organisations.
How applicable are the statistical elements of Six Sigma thinking to the service sector and how should they be applied?
The analysis challenge
One of Six Sigma’s strengths is that it promotes “data-based decision making”. By collecting data methodically and applying statistical rigour, the significant factors driving the root cause of a problem will emerge.
In a manufacturing plant with standard procedures and calibrated machines it is undoubtedly possible to use such techniques to isolate and address the most significant factors which may cause defects.
In service organisations however, there is inherently much more variation in the production system caused in large part by the variety of demands that customers have. Rather than there being a finite and deterministic number of permutations as there are with a physical product, a service has almost infinite variety.
Consequently, using statistics to determine root cause is often difficult – there is just too much natural variation in the system. However, that doesn’t mean that Six Sigma thinking has no place in a service organisation; it just needs to be applied and focused in different ways.
Are we in control?
When is a problem not a problem? It is amazing how often a serious event such as a customer complaint can trigger a great deal of effort to solve the problem, only for it to be subsequently discovered that this was a freak event and not typical of how the process performs.
This situation is like your journey to work. Usually you arrive at roughly the same time every day, within a typical window of time. Then one day an accident on the motorway or a broken-down train makes you several hours late. It’s a serious problem, but not a frequent or repeatable one. What is potentially more concerning is if your journey was gradually taking more time over several weeks, or there was a sudden “step change” in how long it was taking.
Process analysis in a service organisation should be approached in the same way, where problems are identified through a gradual drift in performance over time or a sudden shift that becomes the new normal.
Six Sigma thinking can help identify problems in several ways: firstly, by helping to identify appropriate measures that relate to quality, cost or speed of delivery; secondly, by using “control charts”, which track these measures over time and apply statistical rules to highlight when a significant change has occurred.
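As a minimal sketch of the control-chart idea, the snippet below builds an “individuals” chart for a hypothetical series of daily handling times. It estimates process variation from the average moving range (divided by the standard d2 constant of 1.128 for pairs) and flags any point outside the three-sigma limits; the data and function names are illustrative, not from any particular system.

```python
from statistics import mean

def control_limits(values):
    # Individuals (I) chart: estimate sigma from the average moving range
    # between consecutive points, divided by the d2 constant for n=2 (1.128),
    # as in standard SPC practice. This resists inflation by a single outlier.
    centre = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = mean(moving_ranges) / 1.128
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(values):
    # Return the indices of points falling outside the control limits.
    lcl, _, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical daily call-handling times (minutes); day 10 is a clear shift.
times = [12, 11, 13, 12, 14, 11, 12, 13, 12, 30]
print(out_of_control(times))  # → [9]: only the final spike is flagged
```

Real SPC software adds further run rules (trends, runs above the centre line) that catch the gradual drift described above, not just single out-of-limits points.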
The 80:20 rule
Imagine that the number of calls being received by a service team has been identified as a problem and worthy of further investigation. The range of calls received by the team is likely to be broad and the cause of each call equally diverse. Potentially it is a huge task to investigate and resolve the reason for every call type.
However, Six Sigma thinking can help us focus efforts much more effectively. If the calls are categorised by type it is likely that a small number of categories will account for most of the calls. This “80:20 rule” (or the Pareto Principle) is a frequently observed situation where the “vital few” categories can be identified and studied in more detail, whilst the “trivial many” are discarded.
The Pareto chart, a visualisation of the frequency of each category, is one of the most useful tools available: at a stroke it can eliminate 80% of the effort associated with problem solving and maximise the impact of the resources and time available.
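The Pareto calculation itself is simple, as this sketch shows: count each category, sort by frequency, and keep adding categories until they cover 80% of the total. The call categories and counts below are invented for illustration.

```python
from collections import Counter

def vital_few(categories, threshold=0.8):
    """Return the smallest set of categories that together account for
    at least `threshold` of all observations (the Pareto 'vital few')."""
    counts = Counter(categories).most_common()  # sorted, most frequent first
    total = sum(c for _, c in counts)
    running, chosen = 0, []
    for name, count in counts:
        chosen.append(name)
        running += count
        if running / total >= threshold:
            break
    return chosen

# Hypothetical call log: 100 calls across five categories.
calls = (["billing"] * 55 + ["password reset"] * 25
         + ["delivery"] * 10 + ["returns"] * 6 + ["other"] * 4)
print(vital_few(calls))  # → ['billing', 'password reset'] cover 80% of calls
```

Two of the five categories account for 80% of the calls, so investigation effort can be concentrated there.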
Avoiding the average
Many service metrics use averages to simplify the display and analysis of performance: Average Handling Time in a contact centre; Average Time To Pay in customer accounts and so on. Whilst such measures are easy to comprehend, they are potentially very misleading. They mask variation in performance which may be considerable and potentially hide large numbers of customers who are receiving sub-standard service.
Six Sigma analysis encourages us to look beyond the average and to consider the range and variability of performance. Imagine measuring how long it takes for a service team to resolve a customer’s query. If the target for resolution is 7 days we might find that on average queries are resolved in less time and therefore assume that the process is working well.
We could instead ask: what percentage of queries take longer than 7 days? What is the range or spread of resolution times? By doing so we may find that a considerable percentage of queries are taking longer than 7 days and some are taking hundreds of days to resolve. Suddenly our understanding of process effectiveness is cast in a new light.
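This sketch makes the contrast concrete with invented resolution times: the average comfortably beats the 7-day target, yet a quarter of the queries miss it, and the worst case is far outside it.

```python
from statistics import mean

def beyond_average(days, target=7):
    # Contrast the average with the share of queries missing the target
    # and the worst case, to expose what the mean hides.
    late = sum(1 for d in days if d > target)
    return {
        "average": round(mean(days), 1),
        "pct_over_target": 100 * late / len(days),
        "worst": max(days),
    }

# Hypothetical resolution times in days: most queries are quick,
# but a long tail of slow ones is hidden by the average.
resolution = [1, 2, 2, 3, 3, 4, 4, 5, 6, 8, 9, 30]
print(beyond_average(resolution))
# → {'average': 6.4, 'pct_over_target': 25.0, 'worst': 30}
```

An average of 6.4 days looks like success against a 7-day target; the 25% of customers waiting longer, one of them over four times the target, tell a different story.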
By changing how data is represented and analysed, Six Sigma thinking creates far greater insights into process performance and standards of service.
Have we made a real difference?
Once a solution to a problem has been identified and implemented it is tempting to move on to the next big challenge. Any change in performance is deemed to be a benefit. But how do we know that the change has had the desired effect and how sustainable is it?
The statistical analysis tools associated with Six Sigma can be invaluable at this point. If the new process or change can be tested and compared to the “old way” it becomes possible to make a direct comparison between two outcomes. Analysis enables the change to be evaluated and the statistical significance of any improvement to be determined. By doing so it becomes possible to say with confidence that any improvement is due to the change and not just down to chance.
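One simple way to test such a before/after comparison is a two-sample permutation test, sketched below with invented resolution times. It asks: if the “before” and “after” labels were assigned at random, how often would we see a difference in means at least as large as the one observed? A small proportion (the p-value) suggests the improvement is real rather than chance. Practitioners more commonly use a t-test from a statistics package; the permutation approach is shown here because it needs no distributional assumptions.

```python
import random

def permutation_test(before, after, trials=10_000, seed=1):
    """Two-sample permutation test on the difference in means.

    Returns the fraction of random relabellings whose absolute mean
    difference is at least as large as the observed one (the p-value)."""
    observed = abs(sum(before) / len(before) - sum(after) / len(after))
    pooled = before + after  # new list; originals are not mutated
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:len(before)], pooled[len(before):]
        hits += abs(sum(a) / len(a) - sum(b) / len(b)) >= observed
    return hits / trials

# Hypothetical resolution times (days) before and after a process change.
before = [6, 7, 8, 6, 9, 7, 8, 7, 6, 8]
after = [4, 5, 4, 6, 5, 4, 5, 6, 4, 5]
p = permutation_test(before, after)
print(f"p = {p:.4f}")  # a small p-value suggests a genuine improvement
```

Here the mean falls from 7.2 to 4.8 days and the resulting p-value is far below the conventional 0.05 threshold, so the improvement would be judged statistically significant.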
Furthermore, the Process Control methods described earlier can be used to track performance over time and to ensure that any improvement is both sustained and built upon.
Find out more…
Contact us to find out how we can help you apply Six Sigma thinking to improving your business.