The Problem with Employee Churn Modelling
Background
In response to my recent article on Campaign Management, a friend of mine asked if I could write a follow-up piece about the bad thinking and bad practices in how organisations solve employee churn. And so I dedicate my 12th article to unpacking the data analytics approach behind this perennial organisational problem. This article can also be seen as an adjunct to my recent article on HR’s digital and data literacy.
Without fuss, we can define employee churn (I personally prefer the word ‘attrition’) as the phenomenon of employees leaving a company while they are still organisationally valued (as opposed to job redundancies and firings). Because human resources are seen as assets, HR and organisation leaders are keen to maximise the returns on these assets, and losing them unexpectedly causes all kinds of business and economic disruption to the organisation. When HR is asked to embrace data analytics / data science, their go-to use case is oftentimes the solving of employee churn. “Can I predict which employee is likely to leave the organisation so that I can intervene early enough to prevent it from happening?” That is the classic setup of the problem statement. And right there in the problem statement lies the bad thinking, and the bad practice that then ensues. First, there is the flawed assumption that you could successfully intervene even if you knew which employee was intending to leave. Secondly, there is the assumption that what constitutes a ‘valuable’ employee worth saving is universally understood by everyone.
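To make that classic setup concrete, here is a minimal sketch of what such a model typically looks like in practice. The file name, column names, features and label are hypothetical illustrations, not a recommended specification; the point to notice is that the output is simply a ranked list of attrition probabilities, with nothing about why anyone might leave or what would change their mind.

```python
# Minimal sketch of the "classic" employee churn propensity model.
# The file name, column names, features and label are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("hr_snapshot.csv")          # hypothetical monthly HR snapshot
features = ["tenure_months", "salary_percentile", "days_since_promotion",
            "manager_changes_12m", "overtime_hours_3m"]
X, y = df[features], df["left_within_6m"]    # hypothetical label: left within 6 months

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# The deliverable: a ranked list of "who might leave" -- and nothing about
# why they might leave or which intervention would prevent it.
df["attrition_risk"] = model.predict_proba(X)[:, 1]
print(df.sort_values("attrition_risk", ascending=False).head(10))
```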
Can We Stop The Rain?
A common mistake in data science is the belief that most problems can be solved if we could accurately predict outcomes. The missing piece in the conversation is the intended treatment of the predicted outcome, and that, in my experience, is a very critical piece of the solutioning. Broadly speaking, there are only two types of intended treatment of a predicted outcome: either you ‘ride along’ with the predicted outcome, or you intervene and try to prevent it from occurring. So, exploitation versus intervention of outcomes.
An example of exploiting the outcome is when you predict who is likely to have a particular product need, and you leverage that prediction to put out a timely and attractive offer so you are amongst the first to fulfil that need. Predicting the weather is the same thing; you cannot change the weather, but you can react to it by making personal adjustments. It should be obvious that if the intention is to exploit the outcome, then investing in the accuracy of that prediction is the right thing to do. The effort goes into getting the timing and product availability right.
Now consider the case of intervening in the outcome. Predicting the likelihood that an employee will attrite falls into this category. Compared to exploiting the outcome, the effort required to prevent or change the predicted outcome is significantly greater! In addition, the predictive model carries no information value or insight into which intervention would be most effective or appropriate. I like to use the ‘deer caught in the headlights’ analogy: if you see a deer caught in your headlights while you are driving at night, you can predict with a high level of accuracy that you are going to hit the deer, and you do. Hit the deer, I mean. The accuracy of the prediction changes nothing.
What Are We Losing?
The other flaw in the construct of the employee churn problem statement is the notion that we are trying to ‘save’ the employee (from attriting). That is not really the nature of the problem. The reality is that we want to save the ‘stuff’ that walks out the door along with the departing employee. That ‘stuff’ can relate to (a) productivity, (b) rare skills & competencies, or (c) institutional knowledge. The impact on the organisation is different for each of these loss types, and it is, ultimately, a deep understanding of these risks that should shape the problem-framing around employee churn. Predicting the likelihood of losing this ‘stuff’ can provide information value for shaping a more nuanced and meaningful intervention approach. For example, someone who is highly productive may be triggered by perceived unfairness in compensation (a push factor); someone who possesses rare skills & competencies may be triggered by external market forces paying a hefty premium (in compensation, rank or title) for those skills & competencies (a pull factor); and someone who possesses valuable institutional knowledge may be triggered when they feel under-recognised (both a push and a pull factor).
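One way to make this loss-type framing operational is a simple tagging exercise before any modelling is attempted. The sketch below is purely illustrative; the fields, thresholds and rules are hypothetical assumptions, and in practice they would come from your own performance, skills and knowledge-mapping data.

```python
# Illustrative sketch: tagging employees by the type of loss their departure
# would represent. Fields, thresholds and rules are hypothetical.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    productivity_percentile: float   # e.g. derived from performance data
    skill_rarity_score: float        # e.g. 1.0 = very rare skill set in the market
    tenure_years: float
    knowledge_criticality: float     # e.g. rated by process owners

def loss_types(e: Employee) -> list[str]:
    """Return the kinds of 'stuff' the organisation risks losing with this employee."""
    risks = []
    if e.productivity_percentile >= 0.9:
        risks.append("productivity (watch push factors such as perceived pay unfairness)")
    if e.skill_rarity_score >= 0.8:
        risks.append("rare skills & competencies (watch pull factors such as external premiums)")
    if e.tenure_years >= 8 and e.knowledge_criticality >= 0.7:
        risks.append("institutional knowledge (watch for under-recognition)")
    return risks or ["low loss impact"]

print(loss_types(Employee("A. Tan", 0.95, 0.4, 10, 0.9)))
```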
It’s About Risk Management
So what should you do to solve the employee churn problem? If your intention is to intervene in the outcome, then predicting the outcome is not going to be very helpful. Instead, diagnostic analytics should be the preferred course of action. Predictive models are based on correlations and often do not contain the causative elements, as these data points are rarely captured systematically. The ability to understand the nature of the risks that a departing employee poses to the organisation must be the starting point for any economically meaningful conversation on employee churn. Diagnostic analysis then allows you to figure out the ‘journey of discontent’ and the ‘moments of truth’ where the balance tips for each of these risk types. For example, a highly productive employee may tolerate an overbearing boss but might not appreciate that rare moment of public berating. Trigger-based solutioning, sketched below, would then be more appropriate than an outright predictive model.
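Here is a minimal sketch of what trigger-based solutioning could look like: instead of scoring everyone with a churn probability, you watch for the ‘moments of truth’ relevant to each risk type and raise a flag when one occurs. The event names, fields, thresholds and suggested responses are all hypothetical assumptions for illustration only.

```python
# Illustrative sketch of trigger-based solutioning. Event names, fields,
# thresholds and responses are hypothetical.
def attrition_triggers(employee: dict, events: list[dict]) -> list[str]:
    flags = []
    for ev in events:
        # Push factor: a top performer whose pay review lands below the peer median.
        if (ev["type"] == "pay_review"
                and employee["productivity_percentile"] >= 0.9
                and ev["outcome_vs_peer_median"] < 0):
            flags.append("pay fairness trigger: review compensation positioning")
        # Pull factor: a rare-skill holder showing signals of external interest.
        if (ev["type"] == "external_offer_signal"
                and employee["skill_rarity_score"] >= 0.8):
            flags.append("market pull trigger: fast-track retention conversation")
        # Push factor: a knowledge holder passed over in a recognition cycle.
        if (ev["type"] == "recognition_cycle"
                and not ev["recognised"]
                and employee["knowledge_criticality"] >= 0.7):
            flags.append("recognition trigger: plan acknowledgement and knowledge transfer")
    return flags

emp = {"productivity_percentile": 0.95, "skill_rarity_score": 0.3,
       "knowledge_criticality": 0.8}
evts = [{"type": "pay_review", "outcome_vs_peer_median": -0.05},
        {"type": "recognition_cycle", "recognised": False}]
print(attrition_triggers(emp, evts))
```

The design choice here is that each flag pairs a detected ‘moment of truth’ with a suggested intervention, which is exactly the information a propensity score does not carry.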
Knowing the risks associated with the employee, you should also put in place contingency plans assuming the employee will walk. This will help to de-risk the situation. Can and should you find a replacement when the employee leaves? Can you scale up those rare skills & competencies to make them less rare? If such contingency plans are in play, then it makes better sense to develop the churn predictive model as a decisioning input.
Conclusion
I have been guilty of this bad thinking and bad practice too; I am not immune to it. Over the course of my banking and consulting career, I have been involved in building a variety of churn models, for both customers and employees. HR clients continue to ask for employee churn models, and it is hard to convince them otherwise: the employee churn model has a particular allure in signalling the adoption of data science for HR. Bringing a diagnostic angle to the conversation is essential, but it obviously entails much more groundwork. It is important to make stakeholders aware that, when it comes to predictive models, intervening in the outcome is so much harder than exploiting it. Re-framing the attrition problem statement as a risk management problem can help open up new perspectives, shifting the focus to what to do with the model output as opposed to what the model input is telling us. These cognitive efforts will pay dividends in the long run.