The critical role data quality plays in analytics
Ashleigh Dickson, Business Development and Data Service Management at PBT Group
A fast-paced, digitally driven world characterised by constant technological evolution and change is certainly an exciting space for business to operate in. Digital presents ample opportunities that were once impossible to even imagine, yet are now worth exploring – not just to gain a competitive advantage, but for renewed business growth and a sound strategy.
And those involved in the data space have the privilege and excitement of being right in the middle of all this digital action. As digital evolves, the hunger for data grows as businesses seek to leverage its value as an asset for strategic growth.
One way of achieving this is to put your “reliable” data to work by investing in data analytics. To my mind, the role data analytics plays in business strategy is somewhat obvious. In fact, research forecasts that the global data analytics market will achieve a CAGR of 30.8 percent through 2023, reaching a market value of $77.64 billion by the end of the period.
Effectively analysing data can produce insights the business did not previously hold. Such insights can be critical to a business’s future planning, key decisions, and overall sustainability and success: they support a business in making strategic enhancements, and in understanding and therefore targeting customers better, resulting in improved overall outputs and likely higher retention rates. These are all good things we now know and apply in our day-to-day business engagements, with varying degrees of maturity. However, the challenge remains to realise the benefits from the investments made.
Whilst a variety of business processes, analytics models, governance and other factors contribute to the value of data analytics, the key to reaping these benefits does not lie in the analytics alone, but also in the quality of the data being analysed. Businesses must understand that the impact of data quality on the success of analytics is critical. If the integrity of the data being used is questionable, the end result may not target the right outcome, which can undermine the credibility of the analytics within the organisation and jeopardise the customer relationship. In the case of customer analytics, it’s fair to say that no customer today wants to be contacted about an offering that doesn’t meet a current need – or worse, about an offering they already hold with that same provider. And while the data may not always be perfectly accurate, there are some important aspects that should be taken into consideration to ensure sound data quality for analytics and to achieve the right value-based results.
The business data foundation
Data can come into a business from many different sources and in varied structures. Considering this, and the fact that not all data is of sound quality, a business must have a solid data-focused foundation in place, with the right technology infrastructure, that can effectively sort, manage and store the good data for the purpose of analysis. A strong data foundation also ensures that aspects such as data governance and security are addressed in support of achieving quality data status.
Data quality tools and loading process
A number of tools within the data quality framework can form part of a data strategy, allowing businesses to assess whether their data is of good quality, up to date and of real value. However, a consideration that must be addressed here is the data loading process – the time it takes for data to move from the source to the point where it is served for analytics. Data quality tools are only as effective as the actions taken on what they detect: in most cases, the tool will not fix data latency or loading issues itself, but will identify them to the relevant stakeholders.
To mitigate this risk to data quality, businesses need to ensure that there is monitoring and tracking at all the different levels of the data, through the whole ecosystem, right up until the data is loaded into the data quality tools. This ensures that the analytics produced from the data is effective and valuable. If there are gaps in the data, they will be identified within this process and the affected data sent back or removed, ensuring that the data ultimately used is of good quality and that the analytics is therefore comprehensive.
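The gap-detection and removal step described above can be sketched in a few lines of Python. This is an illustrative example only: the field names (`customer_id`, `email`, `last_updated`) and the specific checks are assumptions for the sake of the sketch, not a reference to any particular data quality tool.

```python
# Hypothetical customer record schema; field names are illustrative only.
REQUIRED_FIELDS = ["customer_id", "email", "last_updated"]


def check_record(record):
    """Return a list of quality issues found in a single record.

    An empty list means the record passed all checks.
    """
    issues = []
    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing:{field}")
    # Validity: a very light email sanity check.
    email = record.get("email", "")
    if email and "@" not in email:
        issues.append("invalid:email")
    return issues


def partition_by_quality(records):
    """Split records into (good, rejected) so only sound data reaches analytics.

    Rejected records carry their issue list so they can be sent back
    to the source for correction or removed from the pipeline.
    """
    good, rejected = [], []
    for record in records:
        issues = check_record(record)
        if issues:
            rejected.append((record, issues))
        else:
            good.append(record)
    return good, rejected
```

In practice this kind of logic would live inside the monitoring layer of the pipeline, so records with gaps are caught before they ever reach the analytics environment.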
Of course, given the opportunities that analytics can provide, there is often pressure to complete the analytics within a short timeframe. If there is a gap between the time the data is loaded into the system and the timeframe in which the business needs to run the analytics, there is a risk of the analytics being performed on data that is not yet fully ready, or not yet complete. To avoid this, a business must ensure that data timeframes are in place and in line with the various business requirements and objectives.
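As a minimal illustration of keeping data timeframes in line with business requirements, a pipeline could gate the analytics run on the freshness of the latest load. The four-hour threshold below is a hypothetical stand-in for whatever the actual business requirement dictates.

```python
from datetime import datetime, timedelta


def data_ready_for_analytics(last_load_time, now, max_staleness=timedelta(hours=4)):
    """Return True only if the most recent load is fresh enough
    for the business's analytics window; otherwise the run should wait."""
    return (now - last_load_time) <= max_staleness


# Example: with a noon cut-off, a 10:00 load is usable, a midnight load is not.
noon = datetime(2023, 1, 1, 12, 0)
print(data_ready_for_analytics(datetime(2023, 1, 1, 10, 0), noon))  # True
print(data_ready_for_analytics(datetime(2023, 1, 1, 0, 0), noon))   # False
```

A simple gate like this makes the trade-off explicit: rather than silently analysing incomplete data, the pipeline either waits for the load to finish or flags the delay to the relevant stakeholders.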
It should be noted, as a caveat, that the above suggestions refer to analytics strategies and builds that consider larger data sets, such as predictive analytics, and not to event-based analytics, where historical data quality might not be as important.
With that in mind, and while certainly any business can, and will, find value and benefit in data analytics, it is important for analytics to take place on data of sound quality to ensure viable, value-driven results. Investing in analytics without investing in data quality is meaningless and won’t deliver the returns the business expects from such an approach. Rather, taking the abovementioned aspects and suggestions into consideration will better support a data analytics strategy that derives real value for the business.