There are various tools and concepts for the management of data warehouse quality, which are as follows −
One definition quantifies quality as the ratio of performance to expectation. Quality can also be defined as the loss imparted to society from the time a product is shipped. The total loss to society can be considered the sum of the producer's loss and the user's loss.
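The two definitions above can be sketched in a few lines of code. This is an illustrative sketch only; the numeric values and variable names are made-up assumptions, not measurements from any real product.

```python
# Quality as the ratio of measured performance to expectation
# (illustrative values; any real measurement scheme would define
# both quantities on the same scale).
performance = 0.92
expectation = 1.00
quality = performance / expectation

# Total loss to society as the sum of the producer's loss and the
# user's loss (hypothetical monetary figures).
producer_loss = 120.0  # e.g., rework and scrap cost before shipping
user_loss = 80.0       # e.g., cost of failures after delivery
total_loss_to_society = producer_loss + user_loss

print(quality)                # 0.92
print(total_loss_to_society)  # 200.0
```

A quality ratio below 1.0 signals that performance falls short of expectation; the loss figure captures the complementary view that poor quality costs both the producer and the user.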
It is well known that there is a trade-off between the quality of a product or service and its production cost, and that an organization must find an equilibrium between these two parameters. If the equilibrium is lost, the organization fails either way.
Considerable research has been completed in the area of data quality. Both researchers and practitioners have addressed the issue of improving the quality of decision support systems, generally by improving the quality of their information. The related work in this area has, to a greater or lesser extent, influenced methods for data warehouse quality.
The framework includes seven elements adapted from the ISO 9000 standard: administrative responsibilities, service and assurance costs, research and development, production, distribution, personnel administration, and legal services. This framework covers an essential part of the literature on data quality, yet only the research and development aspect of data quality appears to be relevant to data warehouse quality design.
Three main problems arise in this field: analysis and design of the data quality aspects of data products, design of data manufacturing systems (DMSs) that incorporate data quality methods, and definition of data quality measures.
A data quality system encompasses the organizational structure, responsibilities, processes, and resources for implementing data quality management. Data quality control is the set of operational techniques and activities used to attain the quality required for a data product. Data quality assurance includes all the planned and systematic actions necessary to provide adequate confidence that a data product will satisfy a given set of quality requirements.
The quality of the data stored in the warehouse is not a process by itself; it is affected by all the processes that take place in the warehouse environment. The main data quality factors are as follows −
The completeness factor describes the percentage of the real-world data of interest that is actually stored in the sources and the warehouse.
The credibility factor describes the trustworthiness of the source that supplied the data.
The accuracy factor describes the accuracy of the data entry process performed at the sources.
The consistency factor describes the logical coherence of the data with respect to logical rules and constraints.
The data interpretability factor is concerned with data documentation (i.e., data layouts for legacy systems and external records, table definitions for relational databases, primary and foreign keys, aliases, defaults, domains, descriptions of coded values, etc.).
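Some of the factors above, such as completeness and consistency, can be computed directly from the stored data. The following is a minimal sketch, assuming a toy record set and a hypothetical consistency rule (that age must be non-negative); the field names and rule are illustrative, not part of any standard.

```python
# Toy record set with one missing value and one rule violation.
records = [
    {"id": 1, "age": 34, "country": "DE"},
    {"id": 2, "age": None, "country": "FR"},   # missing value
    {"id": 3, "age": -5, "country": "FR"},     # violates the age rule
]
fields = ["id", "age", "country"]

# Completeness: fraction of expected values that are actually present.
total = len(records) * len(fields)
present = sum(1 for r in records for f in fields if r.get(f) is not None)
completeness = present / total

# Consistency: fraction of records satisfying the logical constraint
# (a present age must be a non-negative number).
def is_consistent(record):
    return record["age"] is None or record["age"] >= 0

consistency = sum(1 for r in records if is_consistent(r)) / len(records)

print(f"completeness = {completeness:.2f}")  # 8 of 9 values present
print(f"consistency  = {consistency:.2f}")  # 2 of 3 records pass the rule
```

Credibility and accuracy, by contrast, usually cannot be computed from the warehouse data alone; they require knowledge about the sources and the data entry process.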