20-06-2023
Numerous organisations have jumped on the bandwagon of exploring People Analytics because of the benefits it promises – workforce insights that enable better decision-making, an improved employee experience, stronger organisational performance and many other business outcomes. Yet what is often overlooked is that a solid base of data quality must be in place for People Analytics to live up to its promise. Data quality is a tricky subject that HR departments struggle with, especially as they often lack clear standards for how data is collected, stored or migrated between different systems.
If you find yourself unsure where to begin or simply wish to deepen your understanding, you’re in luck: we are here to guide you. In this article, we explain the concept of data quality and why we should bother with it at all. After reading this, you will have a clear understanding of 3 important dimensions of data quality and how they connect to People Analytics. And for those of you wondering how to put all this into practice, have a look at how Quintop can help you with your data quality journey.
Data Quality: Why bother?
Data quality is the extent to which data is error-free and fit for its intended purposes. The basic promise of People Analytics to provide quality insights about the workforce rests on the quality of the data being analysed. Data quality is thus a critical aspect of implementing People Analytics in an organisation. If poor-quality data is used, the reliability and validity of the insights gained from Analytics suffer. A common saying in Analytics is “garbage in, garbage out”.
This in turn can result in two scenarios. If people are aware that the data is of low quality, they will be less likely to use the insights gained to guide their decisions. This hurts the organisational adoption of People Analytics and can lead to scepticism and resistance towards the use of data-driven insights in the organisation. Even with high-quality data, low adoption is already one of the issues companies struggle with the most when implementing People Analytics. On the other hand, if people believe the data quality is adequate when it is not, they may unknowingly use the insights generated from that data and make misinformed decisions, which can prove extremely costly for everyone involved.
3 Dimensions of Data Quality
There are many frameworks for measuring data quality, some of which distinguish up to 10 dimensions, but for the purpose of brevity in this article we will focus on the 3 most recognised ones: accuracy, completeness and consistency.
- Accuracy
Data is considered accurate when it closely reflects reality and is free of errors. In HR, inaccurate data can take the form of misspelled contact details, faulty employee attendance records or wrong employment end dates. To illustrate the impact on the use of People Analytics to inform decision making: having inaccurate employment end dates in HR systems will result in an incorrect turnover rate. Turnover rate is one of the most important metrics needed to keep track of organisational health. Using an inaccurate rate can obscure the fact that there are issues making people leave which require investigation.
- Completeness
Data is considered complete when all the required information is present, and no important details are missing. Incomplete data manifests in HR systems, for instance, when fields are skipped or when fields for needed information do not exist. To illustrate the impact on People Analytics: if a company is running a program to improve diversity and inclusion, yet recruitment data about diversity and inclusion is missing, it can be difficult to track progress toward hiring goals.
- Consistency
Data is considered consistent when it is the same, or equivalent, across different source datasets. Inconsistent data can take the form of different values for the same metric, employee, or department. A very popular example is FTE count. There are many ways to define how much time an employee works, and whether that employee is part-time or full-time. For instance, the finance department may exclude long-term sickness or maternity leave from its FTE definition, while other functions do not. This results in different versions of the truth, where different departments report different FTE values. As FTE is a very important metric informing decisions such as budgeting or hiring, this can lead to confusion and inter-departmental conflict. The sketch below shows what simple checks for these three dimensions could look like.
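To make the three dimensions more concrete, here is a minimal sketch of automated checks on an employee dataset. It uses pandas, and the column names (employee_id, end_date, gender, fte_hr, fte_finance) as well as the values are assumptions for illustration; your own systems and definitions will differ.

```python
import pandas as pd

# Hypothetical employee extract; column names and values are illustrative only.
employees = pd.DataFrame({
    "employee_id": [101, 102, 103, 104],
    "end_date":    ["2023-03-31", "2199-12-31", None, "2023-05-15"],
    "gender":      ["F", None, "M", "F"],
    "fte_hr":      [1.0, 0.8, 1.0, 0.6],   # FTE as recorded by HR
    "fte_finance": [1.0, 0.8, 0.9, 0.6],   # FTE as recorded by Finance
})

# Accuracy: flag end dates that cannot reflect reality (implausibly far in the future).
end_dates = pd.to_datetime(employees["end_date"], errors="coerce")
implausible_end_dates = employees[end_dates > pd.Timestamp("2100-01-01")]

# Completeness: count missing values in the fields the analysis requires.
missing_per_field = employees[["end_date", "gender"]].isna().sum()

# Consistency: find employees for whom HR and Finance report a different FTE.
fte_mismatch = employees[employees["fte_hr"] != employees["fte_finance"]]

print(implausible_end_dates, missing_per_field, fte_mismatch, sep="\n\n")
```

None of these checks is sophisticated on its own, but together they already surface the kind of issues that quietly undermine People Analytics insights.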
Factors contributing to data quality loss
Manual data entry errors. Humans make mistakes, so some errors will be unavoidable. The good news, however, is that the impact of such errors is of a much smaller magnitude than that of the other factors outlined below.
Insufficient data management skills or awareness of the company’s data management process. Perhaps employees require up-skilling in data management or are otherwise unaware of the established organisational data management process. Training will then resolve these issues.
Lack of established data definitions. When analysing data, it should be clear, and unanimously agreed at an organisational level, what exactly is meant by the metrics used. Companies often use different formulas for the same concepts, and differences can even exist under the same “roof”: between departments (see the FTE example above), teams or colleagues of the same company. It is therefore important to have a data dictionary or glossary in place where all concepts are defined and accessible; a minimal example of what an entry could look like is sketched below.
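As an illustration, a glossary entry can be as simple as a structured record stating the definition, the agreed formula and the owner of a metric. The sketch below uses Python purely as notation; the fields and the FTE formula shown are assumptions, not a prescribed standard.

```python
# Hypothetical data dictionary entry; fields, formula and values are illustrative only.
fte_definition = {
    "metric": "FTE count",
    "definition": "Total contracted hours divided by standard full-time hours, "
                  "summed over all active employees.",
    "formula": "sum(contracted_hours_per_week) / standard_full_time_hours_per_week",
    "includes": ["active employees", "long-term sickness", "maternity leave"],
    "excludes": ["contractors", "terminated employees"],
    "owner": "HR Analytics",
    "agreed_by": ["HR", "Finance", "Operations"],
    "last_reviewed": "2023-06-01",
}
```

Whether such entries live in a dedicated catalogue tool, a wiki or a simple shared file matters less than the fact that there is a single, agreed version everyone can consult.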
Lack of explicit data management governance & process. A conscious effort must be made to manage data well; data quality is unlikely to simply “happen”. It is thus essential to have a data governance framework in place to ensure that data is treated as a strategic asset. This includes standards and policies for proper, compliant data handling. Further, it includes data management processes capturing the actions needed to maintain those standards throughout the data lifecycle. Importantly, it also specifies who is allowed to request, decide on, or deploy changes to data definitions, processes, or the governance framework itself.
An ineffective data management process. Perhaps a process exists, but it contains steps which damage data quality. For example, a mandatory field for employee birthday must be filled in at the time of onboarding a new employee, but this information only becomes available later on. As a workaround, HR personnel fill in dummy values (e.g. 01-01-1990) and must later return to update the record with the correct information. But sometimes that second step is never taken, and mistakes slip into the system. This kind of issue requires a change in the process (and in the system configuration); until that happens, a simple check like the one sketched below can at least keep the dummy values visible.
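While the process is being fixed, a small script can flag records that still contain placeholder values so they are not forgotten. This is a minimal sketch assuming a pandas DataFrame with a hypothetical birth_date column and 01-01-1990 as the agreed dummy value.

```python
import pandas as pd

# Hypothetical extract from an HR system; column names and values are illustrative only.
employees = pd.DataFrame({
    "employee_id": [201, 202, 203],
    "birth_date":  ["1990-01-01", "1985-07-23", "1990-01-01"],
})

# The placeholder value used as a workaround during onboarding.
DUMMY_BIRTH_DATE = pd.Timestamp("1990-01-01")

birth_dates = pd.to_datetime(employees["birth_date"], errors="coerce")

# Flag records that still carry the dummy value so they can be corrected.
needs_follow_up = employees[birth_dates == DUMMY_BIRTH_DATE]
print(needs_follow_up)
```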
What Quintop can do for you
While the list of factors is not exhaustive, we can help you with the factors outlined above, and more. A possible solution for data quality involves first understanding your current situation by evaluating the People Analytics maturity of your organisation (including your data quality levels). Based on this assessment, we create an action plan that suits your unique situation to help you reach the desired levels of data quality. Then depending on the action plan, we will assist you with one or more of the following:
- Create data definitions & a data dictionary
We will facilitate workshops with all key stakeholders for creating organisation-wide agreed data definitions, which are then captured in a data dictionary (or glossary).
- Perform a one-off data cleansing
We will also help you in the actual cleansing of the data, of course. One option is a one-off data cleanse that improves the quality of one or more datasets at a single point in time. This is effective, for instance, when transitioning to a new system, under the condition that the data in the new system will be managed the right way. However, to prevent a one-off cleanse from becoming a recurring activity, it is important to sustain the quality level achieved after the cleansing by putting the right processes and controls in place. This ensures a structural improvement of data quality, as it addresses the root cause and not just the effects.
- Create a data governance framework and process for data management
We will also help with the design of a data management governance framework, with clear data standards, policies and processes. The final solution is in line with best practices, compliant with laws and regulations, and tailor-made to best suit your organisation’s needs.
- Training for building data literacy and/or process awareness
We will support your organisation in developing HR professionals’ skills and capabilities by facilitating training sessions and workshops aimed at improving data literacy and/or knowledge of the organisational process.
- Set up data quality dashboards
Data quality is a moving target that must be monitored regularly for the best results. By setting up a data quality dashboard, that monitoring becomes much more accessible. Furthermore, adding steps for monitoring data quality to your data management process is a sure-fire way to make sure your data stays healthy over the longer term. The sketch below illustrates the kind of indicators such a dashboard could be built on.
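As an illustration, this minimal sketch computes a per-field completeness percentage and a duplicate count for a hypothetical employee extract; the column names are assumptions, and a real dashboard would track such indicators over time and per dataset.

```python
import pandas as pd

# Hypothetical employee extract; column names and values are illustrative only.
employees = pd.DataFrame({
    "employee_id": [301, 302, 303, 303],
    "department":  ["Sales", None, "Finance", "Finance"],
    "fte":         [1.0, 0.8, None, None],
})

# Per-field completeness (% of non-missing values), a typical dashboard indicator.
completeness = employees.notna().mean().mul(100).round(1)

# Number of duplicate employee records, another simple quality indicator.
duplicate_count = employees.duplicated(subset="employee_id").sum()

print(completeness)
print(f"Duplicate employee records: {duplicate_count}")
```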
So, do you feel like your organisation could use a refresh of its people analytics after reading this article? At Quintop HR Consultants, we are excited to assist you! We provide end-to-end support in developing and rolling out your people analytics strategy, from the initial design of your data-driven HR strategy to the implementation phase and beyond. Contact us today and let’s tackle this challenge together.