The 5 Basic Concepts of Data Quality Assurance
In our previous posts we demonstrated the importance of data quality; this time we shed light on some basic concepts of data quality assurance.
Data or Information Quality Assurance
Data or information quality assurance involves applying proven quality management principles, processes, and practices to information, viewed as the “product” of corporate processes, to ensure it meets or exceeds the expectations of information consumers.
Total Information Quality Management (TIQM)
TIQM (formerly TQdM – Total Quality Data Management) is an internationally recognized information quality assurance methodology developed by Larry P. English to address data quality issues and their management.
The Role of Data Quality Assurance
Larry P. English defines the role of information quality assurance as follows:
“Consistently meeting knowledge worker and end-customer expectations through information and information services.”
To better understand this definition, it is helpful to analyze its components:
- Consistently: If an information service—such as a price list—has varying quality over time, it will be unreliable. Quality information is characterized by its dependability, eliminating the need to “double-check” or verify the information.
- Meet: It is not always necessary to exceed expectations. Often, it is sufficient for information users if the information contains exactly what is needed, no more, no less.
- The expectations of those who work with information: Information quality is determined by its users. Quality criteria established by someone far removed from the information value chain have little significance. Within an organization, multiple individuals may work with the same information, necessitating harmonization of differing (possibly conflicting) expectations.
Information Quality as a Product Attribute
The above definition highlights several critical aspects:
- Information is a product that can be categorized as “quality” or “poor quality.” It is the result of business processes that regulate its creation and maintenance, whether stored in databases, on paper, or in other forms. In the age of information, data must be treated as a direct product, not as a byproduct of business processes as previously considered. Viewing information as a byproduct misplaces the focus, emphasizing the system rather than the ultimate goal: the information itself.
- Information is a product, so the same quality improvement principles applied to manufacturing processes can also be used to enhance the quality of information products.
- Information quality exists only in the context of its user. A product is valuable only if it fulfills the needs of its user. Similarly, information is valuable only if it helps knowledge workers perform their tasks or achieve their objectives. Users are the ultimate judges of information quality, as it depends on how well it supports their work. Shirou Fujita, CEO of NTT DATA (the first information services company to win the Deming Prize), remarked:
“In this industry [information systems], we too often emphasize technological advancements without considering user needs. Information systems must improve significantly to better serve societal demands.”
- Information exists as part of a supplier-user value chain. Suppliers are information producers, executing various processes to create information. Data intermediaries transfer data from one format—such as paper—to another, such as electronic form, acting as conduits between suppliers and users. Users can be internal or external knowledge workers, often occupying both supplier and user roles simultaneously. For instance, a loan officer evaluates credit data to determine creditworthiness, effectively producing new information based on existing data.
Responsibility for Data Quality
Every process owner or manager overseeing data production or maintenance processes is responsible for the quality of the data their processes generate—not just the physical products. The integrity of a process is inseparable from the integrity of its outputs. This responsibility extends to ensuring that the information serves the needs of all subsequent users in the information chain, not just those of the department producing it. Process managers must understand the aggregated information needs of all downstream knowledge workers in relation to the information products they manage.
Failure to do so forces downstream workers to identify or correct missing or incorrect data independently.
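The kind of downstream verification described above is often automated as simple completeness and validity checks. The following is a minimal sketch; the record structure, field names, and rules are illustrative assumptions, not part of any particular methodology.

```python
# A minimal sketch of an automated data quality check on
# hypothetical customer records; field names are illustrative.

def check_record(record):
    """Return a list of data quality issues found in one record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("customer_id", "name", "email"):
        if not record.get(field):
            issues.append(f"missing {field}")
    # Validity: a very rough email format check.
    email = record.get("email", "")
    if email and "@" not in email:
        issues.append("invalid email")
    return issues

records = [
    {"customer_id": "C001", "name": "Alice", "email": "alice@example.com"},
    {"customer_id": "C002", "name": "", "email": "bob-at-example.com"},
]

for r in records:
    problems = check_record(r)
    if problems:
        print(r["customer_id"], "->", ", ".join(problems))
```

Checks like these catch defects before they reach knowledge workers, but they are a supplement to, not a substitute for, fixing the producing process itself.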
In our next blog post, we will detail the process of data quality assurance. Until then, please find a brief description of our data quality assurance solutions here.
Has your company already considered how to improve the quality of its data assets? Why not discuss it over a good cup of coffee?