Cleaning Up Your Act

Redundant data, wrong data, missing data, miscoded data. Every company has some of each, probably residing in IT nooks that don't communicate much. It's not a new problem, but these days the jumble becomes very apparent during high-profile projects, such as installing CRM or supply chain management software.
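The defect types listed above can be surfaced with even very simple profiling. The sketch below is purely illustrative — the toy records, field names and state list are assumptions, not anything from the article — but it shows how duplicates, missing fields and miscoded values might be counted in a customer table.

```python
# Toy customer records illustrating the defects described above:
# a redundant duplicate, a missing field, and a miscoded state value.
records = [
    {"id": 1, "name": "Acme Pty Ltd", "state": "NSW"},
    {"id": 2, "name": "Acme Pty Ltd", "state": "NSW"},   # redundant duplicate
    {"id": 3, "name": "Bravo Corp",   "state": None},    # missing data
    {"id": 4, "name": "Cobalt Ltd",   "state": "XX"},    # miscoded value
]

VALID_STATES = {"NSW", "VIC", "QLD", "SA", "WA", "TAS", "NT", "ACT"}

def profile(rows):
    """Count duplicate, missing and miscoded records in one pass."""
    seen, dupes, missing, miscoded = set(), 0, 0, 0
    for r in rows:
        key = (r["name"], r["state"])
        if key in seen:
            dupes += 1
        seen.add(key)
        if r["state"] is None:
            missing += 1
        elif r["state"] not in VALID_STATES:
            miscoded += 1
    return {"duplicates": dupes, "missing": missing, "miscoded": miscoded}

print(profile(records))
# → {'duplicates': 1, 'missing': 1, 'miscoded': 1}
```

Real cleansing tools do far more (fuzzy matching, address standardisation, reference-data lookups), but the principle — profile first, so you know the size of the problem — is the same.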

Reader ROI

  • Understand why poor data quality is no longer a little problem
  • Find out how costly dirty data can be
  • Learn why buy-in from CXOs is a must for instituting a data quality program

For many companies the trustworthiness of the data they rely on to keep them in business remains a total mystery, forcing them either to base decisions on guesswork or to lapse into a state of virtual paralysis where they feel they can make no effective decisions at all. Other companies make wildly optimistic assessments of the validity of their data, which, as far as potential harm goes, can be even worse.

In today's information economy, knowledge has become such a critical asset that the quality of a company's data is even considered a reliable predictor for its future success. Companies relying on poor quality data will inevitably pay a high price by way of economic damage springing from poorly premised decisions, lost opportunities, bad publicity and risk to reputation.

The Data Warehousing Institute in Seattle has found low quality data costs US businesses $US611 billion per year in bad mailings and staff overhead alone. A 2003 IDC survey of 1648 companies implementing business analytics software enterprisewide found data cleanliness and quality came second only to budget cuts on the list of problems cited. And just 23 per cent of 130 companies surveyed by Cutter Consortium on their data warehousing and business intelligence practices use specialised data cleansing tools.

Achieving data quality is not unlike trying to hold an eel - it is slippery, will not stay still and seems determined to slide out of one's grasp. Consulting firm Hildebrandt International has found data quality decays at an average rate of 33 per cent annually, and without proper attention even "clean" data can become incorrect, unusable and ultimately untrustworthy. That means no company can afford to entirely trust the data it relies on at any one point in time. Yet according to some experts, far too few companies recognise the real data quality issues they face - above all, that when it comes to data, "oils ain't oils", as it were.
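The compounding effect of that 33 per cent figure is easy to underestimate. The short sketch below is a hypothetical illustration of Hildebrandt International's average decay rate, assuming decay compounds annually and no cleansing is done in the interim — assumptions of ours, not claims from the consulting firm.

```python
# Fraction of once-clean records still accurate after n years,
# assuming a compounding 33 per cent annual decay rate (illustrative only).
DECAY_RATE = 0.33

def still_accurate(years):
    return (1 - DECAY_RATE) ** years

for n in range(4):
    print(n, round(still_accurate(n), 2))
# year 0: 1.0, year 1: 0.67, year 2: 0.45, year 3: 0.3
```

On those assumptions, roughly 70 per cent of a "clean" database has gone bad within three years — which is why data quality is a program, not a one-off project.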

"I think a lot of organisations take their data as being sacrosanct in some way," says professor Graham Pervan of Curtin University School of Information Systems. "And of course [the validity of that approach] depends where you're getting the data from. There's data that comes internally from your company, and there's data you can acquire from outside - very often at great expense - and it's not all the same. You need to look with great caution at where the data is coming from and how reliable it is. People are very good at making predictions, but a prediction without error bars on it saying how reliable it is, is not a great deal of use."

The importance of "golden" data - data that has been cleansed, consolidated and proved 100 per cent accurate - cannot be overestimated. Yet despite enormous advances in technological remedies, the data quality problem seems more pervasive and tenacious than ever.

"Every company has data quality issues whether they address them or not," says Frank Block, director for finance solutions, Insightful Corporation. "And the need to integrate disparate data sources creates an avalanche effect. Smart companies recognise the need and apply analytic solutions to access, analyse and report data quality information using a predefined framework." Block says too few companies realise the positive bottom-line benefits that come with improvements in data quality. "While the costs of poor data quality can be steep, the benefits of clean and reliable data are even greater," he says.

Three quarters of respondents to the PricewaterhouseCoopers Global Data Management Survey 2001 reported that investment in effective data management had delivered improved bottom-line results across their business. Almost 60 per cent of respondents had cut processing costs, and well over 40 per cent had managed to boost sales through better analysis of customer data. However, the latest advances in - and thinking on - data quality contain both good news and bad news.
