In my January 4 blog, I discussed timeliness and accuracy as components of data quality, and last week I delved into accuracy and credibility. Today I want to conclude this mini data quality series by looking at the results organizations have achieved with good data quality. I'm a firm believer that we can learn best practices from others rather than reinventing the wheel, so let's begin by learning from Banco Popular, Canon, Iceland Foods and the UK's Ministry of Defence. Full disclosure: these stories come from Trillium Software, but I chose them because they report concrete data quality results. Success stories were much harder to find than bad data quality stories.
Banco Popular, the largest bank in Puerto Rico, has over 2 million customers. Growth over the years took a toll on its data systems: more non-standardized account data was added, employee access to information decreased and response times grew sluggish. As many as 50% of employees rely on these systems in their operations. Banco Popular needed an accurate database to improve customer service and reduce waste. By cleaning the data and removing duplicate records, the bank gained a more accurate view of its customers and saved $70,000 each month in direct customer mailing costs.
Canon Europe's Consumer Imaging Division, a subsidiary of Canon Inc. of Japan, has operations in 18 countries, selling and distributing cameras, printers and fax machines. In 2001 the division centralized operations onto a new system holding about 500,000 records, with a degree of duplication that varied from country to country. In addition, Canon receives 200 data feeds each week from its distributors. Using data cleaning software, they identified 20% of their records as duplicates and merged them automatically. The better data quality saved time over a manual data cleaning process and gave them better customer insight.
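To make the deduplication these stories describe a little more concrete, here is a minimal sketch of one common approach: normalize a few identifying fields into a match key, then merge records that share a key. The record fields, normalization rules and merge policy are my own illustrative assumptions, not the actual process used by Canon or Trillium Software.

```python
import re


def normalize(value: str) -> str:
    """Canonicalize a field: lowercase, strip punctuation, collapse whitespace."""
    cleaned = re.sub(r"[^a-z0-9 ]", "", value.lower())
    return " ".join(cleaned.split())


def match_key(record: dict) -> tuple:
    """Build a key from name + postcode; records sharing it are treated as duplicates."""
    return (normalize(record["name"]), normalize(record["postcode"]))


def merge_duplicates(records: list[dict]) -> list[dict]:
    """Group records by match key, filling missing fields instead of discarding data."""
    merged: dict[tuple, dict] = {}
    for rec in records:
        key = match_key(rec)
        if key not in merged:
            merged[key] = dict(rec)
        else:
            # keep the first record, but copy over any fields it was missing
            for field, value in rec.items():
                if not merged[key].get(field):
                    merged[key][field] = value
    return list(merged.values())


# hypothetical customer rows with one duplicate pair
customers = [
    {"name": "ACME Ltd.", "postcode": "EC1A 1BB", "phone": ""},
    {"name": "acme ltd", "postcode": "ec1a 1bb", "phone": "020 7946 0000"},
    {"name": "Widget Co", "postcode": "SW1A 2AA", "phone": "020 7946 0999"},
]
deduped = merge_duplicates(customers)
# three input rows collapse to two records, and the surviving ACME record
# keeps the phone number contributed by its duplicate
```

Real-world matching is fuzzier than an exact key (typos, transposed fields, address variants), which is why commercial tools use probabilistic matching; but the key-and-merge pattern above captures the basic idea.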
Iceland Foods is one of the top 10 grocery chains in the UK, with over 700 stores. From 1999 until mid-2008 it applied basic data cleaning processes for home delivery and marketing purposes. In October 2008 the company began a nationwide bonus card program and recognized the need for a different data quality approach. Within six months of implementing the new data cleaning software, they reduced data duplication by 13%, allowing them to personalize the grocery shopping experience and increase competitiveness.
The Defence Logistics Organisation in the UK supports the Royal Navy, Army and Air Force using 860 different systems. Analysts estimated it would take 100 staff about 178 years to validate all of its data manually. Using data quality software, they were able to check over 1.7 million records in just over one hour, a huge savings in staff resources. In addition, the Royal Navy saved £20 million (about $36 million USD) by identifying and correcting one attribute across 56,035 rogue data entries in its inventory management system.
The story of good data quality is not told often enough. I believe that good data quality is achievable in any type of organization. All four of these examples faced similar challenges with duplication in their data. The organizations in these success stories, from around the world, were able to identify their data problems, at least in part, and then measure the results and positive impact of cleaning their data.