I’ve been in the business of developing and implementing business applications for over 30 years. Data quality is a perennial problem, and was recently cited by Gartner as costing US businesses an average of $15m in 2018. So when I read an article suggesting there were just three primary causes of poor-quality data in an organisation, it got me thinking. The three causes it listed were:
- Data transfers from legacy systems – absolutely, yes, I agree!
- Company mergers – overlaps with the first cause, perhaps, but undoubtedly a major one in its own right.
- User error (from data entry) – true…
Surely, that can’t be it? So I thought of a few more that I’ve come across…
- Defaulting. The dreaded practice of defaulting an answer on a form to a value that may not be correct for that person. A classic is the date of birth defaulting to 1/1/1900 (suddenly everyone is 118 years old!).
- User training – give the poor data entry person a break; blame their line manager for not coaching them effectively on what they should be entering.
- Lack of standardisation – particularly true if you’re importing or exporting data. The data may be accurate and complete when entered at source, but still fail to comply with the standard expected at the destination.
- Customer error – a variation of user error, especially now that we have digital channels and self-service.
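The defaulting problem above is one of the easier ones to detect automatically: known placeholder values stand out in a profile of the data. Here's a minimal sketch in Python (the field name, record shape, and sentinel list are illustrative assumptions, not a prescribed standard):

```python
from datetime import date

# Dates that commonly indicate a defaulted, rather than genuine, answer.
# This sentinel list is an illustrative assumption - extend it for your data.
SUSPECT_DOBS = {date(1900, 1, 1), date(1970, 1, 1)}

def flag_defaulted_dob(records):
    """Return the records whose date_of_birth matches a known placeholder."""
    return [r for r in records if r.get("date_of_birth") in SUSPECT_DOBS]

customers = [
    {"id": 1, "date_of_birth": date(1985, 6, 14)},
    {"id": 2, "date_of_birth": date(1900, 1, 1)},  # defaulted on the form
]
print(flag_defaulted_dob(customers))
```

A profiling pass like this won't tell you the correct value, but it does quantify how widespread the defaulting habit is before you decide how to fix it.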
Can you think of any more?
#dataquality #infoboss #clearviewbusiness
For more information, download our Best Practice Guide to Data Quality Management.