Data Quality: Cost or Profit?


Market Overview

In the past year, TEC has published a number of articles about data quality (Poor Data Quality Means A Waste of Money; The Hidden Role of Data Quality in E-Commerce Success; and Continuous Data Quality Management: The Cornerstone of Zero-Latency Business Analytics). This time, our focus turns to data quality within the customer relationship management (CRM) arena, and to how applications such as InterAction from Interface Software can help reduce the negative impact that poor data quality has on CRM objectives.

CRM is prone to more corrupt data than any other class of enterprise application. Traditional back-office systems require only a limited number of individuals to process data, whereas in CRM almost everyone in an organization interacts with some part of the application. As a result, the probability of bad data entering the system increases. Ultimately, quality data is the foundation of a successful CRM implementation and of accurate customer intelligence. Past experience and research show that 50 to 70 percent of the effort in many CRM initiatives should be devoted to data quality. Consequently, poor data quality hampers a company's ability to realize a return on its investment in a truly integrated CRM. Data quality, therefore, should not be treated as a one-time exercise; it has to be integrated as a core element of managing the business.

Poor data quality leads to customer complaints and customer defection. It is therefore imperative to clearly define a standard for your data requirements and how those requirements should support specific business objectives. One common source of data quality problems is the fast pace of business environments: information is constantly evolving, people are constantly moving in and out of positions, and companies continually change their contact details. This results in a number of common issues, including

  • Incorrect or inconsistent collection of customer details

  • Duplicate records

  • Inconsistent synchronization between multiple databases

  • Multiple databases scattered throughout different departments or organizations, with data structured according to the particular rules of each database

Duplication is the most common type of data quality issue. Customer records can be duplicated in a multitude of situations, for example through variations in spelling or the assignment of multiple addresses to the same customer profile. As a result, data becomes corrupted and is then misinterpreted: a customer who appears in a database two or more times may create the impression of being several different customers. Businesses lose money if, for example, they run a mailing campaign against such a customer base. Multiple letters sent to the same customer multiply the cost of mailing and fulfillment and reduce the company's credibility in the eyes of its customers.
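
To make this concrete, the following is a minimal Python sketch of how spelling variations can be caught before a mailing goes out. The record layout, the "name" field, and the 0.9 similarity threshold are illustrative assumptions, not part of any vendor's product.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, trim, and collapse whitespace so trivial spelling
    variations do not hide a duplicate."""
    return " ".join(name.lower().split())

def likely_duplicates(records, threshold=0.9):
    """Return pairs of records whose normalized names are near-identical.
    Records are plain dicts; the 'name' key and the 0.9 threshold are
    illustrative assumptions."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a = normalize(records[i]["name"])
            b = normalize(records[j]["name"])
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((records[i], records[j]))
    return pairs

customers = [
    {"name": "Acme Corporation", "city": "Chicago"},
    {"name": "ACME Corporation ", "city": "Chicago"},  # same firm, different spelling
    {"name": "Baker & Sons", "city": "Boston"},
]
for a, b in likely_duplicates(customers):
    print("possible duplicate:", a["name"], "/", b["name"])
```

Merging such pairs before the campaign runs means one letter, one fulfillment cost, and one consistent impression per customer.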

The use of poor quality databases can also lead to a misreading of business operations by generating misleading customer analyses. Customer intelligence and customer behavior analyses are also used for fraud prevention and customer retention, so unreliable data can have catastrophic results there as well. Increasing evidence shows that business intelligence is key to CRM success, and as demonstrated above, mining a poor quality database has exactly the opposite effect.

Market Winners

A stand-alone system alone is insufficient to tackle the underlying causes of poor data quality. Data quality should be considered a business issue, and as such, businesses must create and institute enterprise-wide guidelines for it. A combination of people, process, and technology tools is required to establish a data quality program.

That said, an application's contribution to data cleansing remains a pillar of the overall success of a data quality strategy. Systems may implement data cleansing through the use of centralized data warehouses, the integration of data cleansing software, and the use of third-party data alongside the enterprise application. Some of the available techniques are

  • Data-validating process rules (see the sketch after this list)

  • Centralized database

  • Data cleansing technology

  • Data scrubbing
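
Before turning to data scrubbing in detail, here is a minimal sketch of the first technique, data-validating process rules. The required fields and format patterns are illustrative assumptions; in practice they would come from the company's own data standard.

```python
import re

# Illustrative validation rules; real rules would come from the
# business's own data standard.
RULES = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "postal_code": re.compile(r"^\d{5}(-\d{4})?$"),  # US ZIP, as an example
}
REQUIRED = ("name", "email")

def validate(record: dict) -> list:
    """Return a list of rule violations for one contact record."""
    errors = [f"missing field: {f}" for f in REQUIRED if not record.get(f)]
    for field, pattern in RULES.items():
        value = record.get(field)
        if value and not pattern.match(value):
            errors.append(f"invalid {field}: {value!r}")
    return errors

print(validate({"name": "Acme Corp", "email": "sales@acme", "postal_code": "606"}))
# ["invalid email: 'sales@acme'", "invalid postal_code: '606'"]
```

Applying such rules at the point of entry stops bad records before they ever reach the centralized database.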

Data scrubbing is the process of fixing or eliminating individual pieces of data that are incorrect, incomplete, or duplicated before the data is passed to a data warehouse or another application. The aim of data scrubbing is two-fold: eliminate errors and redundancy, and bring uniformity to different data sets that may have been created with different or incompatible business rules. Data scrubbing can be considered a combination of technology and process. The business-to-business (B2B) list provider Dun & Bradstreet (D&B) has developed a good example of a data scrubbing process. D&B has introduced a uniform coding methodology that associates each individual business with a D-U-N-S number. This number is unique and remains unchanged during the business's life span. D&B created this unique approach to reach and maintain uniformity within its own database. It also offers its customers the same service on a regular basis.
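
The D-U-N-S idea can be illustrated with a toy registry that hands out one persistent identifier per canonical business name. D&B's actual methodology is proprietary; the canonicalization rule below is purely an assumption for illustration.

```python
import itertools

class IdRegistry:
    """Toy registry in the spirit of a D-U-N-S number: each canonical
    business name gets one identifier that never changes afterwards.
    The canonicalization rule here is an illustrative assumption."""

    def __init__(self):
        self._ids = {}
        self._counter = itertools.count(1)

    def canonical(self, name: str) -> str:
        # Strip punctuation and case so variants map to one key.
        return " ".join(name.lower().replace(",", " ").replace(".", " ").split())

    def identify(self, name: str) -> int:
        key = self.canonical(name)
        if key not in self._ids:
            self._ids[key] = next(self._counter)
        return self._ids[key]

registry = IdRegistry()
print(registry.identify("Acme Corp."))    # 1
print(registry.identify("ACME Corp"))     # 1 -- same entity, same ID
print(registry.identify("Baker & Sons"))  # 2
```

Once every record carries such a stable key, scrubbing across incompatible data sets reduces to joining on that key.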

Another approach to achieving high-quality data is the use of program tools developed for managing data quality. These tools fulfill the objectives of auditing, cleansing, and monitoring data. Companies may opt to develop tools in-house or to acquire a third-party tool from a vendor specializing in data cleansing. The majority of data warehouse and business intelligence vendors, such as SAS Institute, Informatica, Experian, and Group 1 Software, provide data cleansing options that sit on top of their databases. FirstLogic and Ascential Software, which also offer this functionality, are considered by the market to be among the most data quality-oriented vendors.

Market Challenger

Few small to medium CRM providers promote their ability to manage data quality. Interface Software, on the contrary, has taken responsibility for its share of data quality. The Illinois (US)-based CRM vendor specializes in relationship intelligence for professional services firms (see previous article: Professional Services Are Catching-up With CRM).

InterAction version 5.1 provides an out-of-the-box data quality synchronization tool. Its centralized database is closely monitored by data stewards to ensure that changes made to the database are appropriate and accurate. The system reduces the burden of data change management by controlling the flow of bad data: a workflow process allows data stewards to efficiently prioritize, evaluate, and act upon contact changes. The workflow plays an important role in the submission process by directing every change to the correct data steward for review.
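
InterAction's internal design is not public, but the routing idea itself can be sketched generically: submitted changes land in a per-steward priority queue, so each steward reviews the most urgent items first. The steward roster, data domains, and priority scale below are illustrative assumptions.

```python
import heapq
import itertools

# Illustrative routing table: which steward owns which data domain.
STEWARDS = {"contacts": "alice", "companies": "bob"}

class ChangeQueue:
    """Priority queue of submitted data changes, one per steward.
    Lower priority numbers are reviewed first."""

    def __init__(self):
        self._queues = {}
        self._seq = itertools.count()  # tie-breaker for equal priorities

    def submit(self, domain: str, change: str, priority: int = 5):
        steward = STEWARDS[domain]  # route to the responsible steward
        heapq.heappush(self._queues.setdefault(steward, []),
                       (priority, next(self._seq), change))

    def next_for(self, steward: str):
        queue = self._queues.get(steward)
        return heapq.heappop(queue)[2] if queue else None

q = ChangeQueue()
q.submit("contacts", "J. Smith: new phone number")
q.submit("contacts", "Duplicate record flagged for merge", priority=1)
print(q.next_for("alice"))  # the merge request comes up first
```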

Another key InterAction data quality feature is the Contact Verifier. When necessary, this integrated tool extracts key business information about contacts and submits it to selected contacts for verification. The Interface Data Change Management functionality allows both professional and administrative users to control who can view and edit their contacts, protecting records from unwanted or incorrect changes.

During our last meeting, Rick Klau, Vice President of Vertical Markets, and John S. Lipsey, Director of Corporate Communication, emphasized that the 2003 objective for Interface Software was to enhance its data quality features: "This is very important to our client base," stated Rick Klau. Reducing the data management burden, enhancing control of bad data through the workflow engine, and keeping track of data disparities were the key themes for 2003.

User Recommendations

Data quality should remain a central focus for any business seeking serious return on investment (ROI) and efficiency. The right methodology is likely a combination of technology and procedures. However, let's not lose sight of the main purpose of a packaged application: Interface customers select the software primarily for its CRM process automation and functionality. As such, they are also entitled to yearly functional improvements. Interface has promised to deliver additional vertical modules and enhanced features in 2004. Let's stay tuned to see how these enhancements perform.
