Data Quality
Inaccurate or inconsistent data can lead a company to poor decisions, reflected in reduced earnings, increased expenditures and overall poor performance. The purpose of a data quality effort is to improve and maintain the completeness, consistency and usability of an organization’s data. Data is considered high quality if it fits its intended uses and correctly represents the real-world facts it describes. Too often, companies cannot rely on the information that serves as the very foundation of their primary business applications, diminishing the value of this critical asset. Organizations must therefore pay close attention to data quality, and the Data Quality Portal is the place to begin your exploration of this important topic. Read our articles, use our white papers for your data quality research, and explore the books and other resources you will find in this portal.
- Implementing Data Quality Through Meta Data (Part 1 of 2)
- How are you addressing the single most difficult problem facing data warehouses today? Data quality. When the quality of data is compromised, incorrect interpretation and use of information from your data warehouse can destroy the confidence of its customers, YOUR users. Once users’ confidence in your warehouse is eroded, it is only a matter of time before your system will no longer exist.
- Implementing Data Quality Through Meta Data (Part 2 of 2)
- This article is the concluding portion of a two-part series on implementing data quality through meta data. The first installment examined the role meta data can play in the data warehouse model and in data acquisition designs for information content and quality. This segment examines real-world examples of technical meta data tags that can be incorporated into your designs to facilitate measurement of data quality and promote user confidence in the informational content of the warehouse (a brief sketch of such tags appears below). This meta data provides a semantic layer of knowledge about the information in your warehouse that is highly valuable to both business users and IT development staff.
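To make the idea concrete, here is a minimal sketch in Python of the kind of quality tags the series describes: indicators recorded with each data acquisition run so users can judge how complete a load was. The field names and the completeness measure are illustrative assumptions, not the article’s actual design.

```python
# A sketch of quality tags recorded for one warehouse load (ETL run).
# All names and the completeness measure are illustrative assumptions,
# not the article's actual meta data design.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LoadQualityTags:
    """Technical meta data captured with a single data acquisition run."""
    source_system: str        # operational system the data came from
    load_timestamp: datetime  # when the warehouse load completed
    records_received: int     # rows read from the source extract
    records_loaded: int       # rows that passed edits and were stored
    records_rejected: int     # rows that failed validation

    @property
    def completeness_pct(self) -> float:
        """Share of source records that made it into the warehouse."""
        if self.records_received == 0:
            return 0.0
        return 100.0 * self.records_loaded / self.records_received

tags = LoadQualityTags(
    source_system="BILLING",
    load_timestamp=datetime.now(timezone.utc),
    records_received=10_000,
    records_loaded=9_940,
    records_rejected=60,
)
print(f"{tags.source_system} load: {tags.completeness_pct:.1f}% complete")
```

Publishing tags like these alongside the data lets users gauge, rather than guess at, how trustworthy a given load is.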
- Data Integrity in a New Light
- One of the main areas of responsibility for any data steward is the enforcement of data integrity. Most data administration texts define data integrity as “attention to the consistency, accuracy and correctness of data stored in a database or other electronic file” (Watson, R., “Data Management”, Wiley, 2000). Commonly, data integrity refers to the validity of data in its various incarnations (electronic, paper, etc.). This approach is primarily reactive: it focuses on the rules used to create and store the “right” values for each data element (a rule-based sketch appears below).
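As a rough illustration of that reactive, rule-based view, the sketch below attaches a validity rule to each data element and checks stored values against it. The elements and rules are hypothetical examples, not drawn from the article.

```python
# A minimal sketch of rule-based data integrity: one validity rule per
# data element, applied when values are created or stored. The element
# names and rules below are hypothetical examples.
import re
from typing import Callable

# Map each data element to a predicate over its stored value.
RULES: dict[str, Callable[[str], bool]] = {
    "customer_id": lambda v: v.isdigit() and len(v) == 8,
    "state_code":  lambda v: v in {"CA", "NY", "TX"},  # toy value domain
    "zip_code":    lambda v: re.fullmatch(r"\d{5}(-\d{4})?", v) is not None,
}

def is_valid(element: str, value: str) -> bool:
    """Return True if the value satisfies its element's validity rule."""
    rule = RULES.get(element)
    return rule(value) if rule else True  # elements without rules pass

record = {"customer_id": "00123456", "state_code": "CA", "zip_code": "9021"}
for element, value in record.items():
    print(element, "OK" if is_valid(element, value) else "INVALID")
```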
- Deborah Poindexter: Meta Data Driven Enterprise Data Management – Future or Fantasy?
- Is your data out of control? Is your enterprise data management fragmented and inconsistent? Are you unable to answer these questions? If the answer to any of them is “Yes”, then Enterprise Data Management (EDM) may be a discipline you should investigate. EDM is steadily gaining acceptance as a critical function of IT, since data is the foundation of all business decisions.
- Data Cleansing: A Dichotomy of Data Warehousing?
- “Dirty data” is a pervasive problem in every company, in every industry! But we have lived with “dirty data” for decades, so why is it such a problem now? Is it because we promise to deliver data warehouses with “clean, integrated, historical data in a short time frame for low cost”, yet we are unable to deal with the prevalence of “dirty data” within the framework of that promise? Some data warehouses are failing because the promised “clean”, “integrated”, “historical” data could not be delivered. Others are failing because the promised “short” time frame and “low” cost were exceeded in the attempt to clean up the data. In other words, they are failing because of the dichotomy of our promise.
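For readers new to the topic, the following sketch shows one small piece of what “cleaning” dirty data involves: trimming, case-folding and mapping inconsistent source codes to a single standard. The code table and values are hypothetical.

```python
# A minimal sketch of standardizing a dirty source field before it is
# integrated into a warehouse. The code table below is a hypothetical example.
GENDER_CODES = {"m": "M", "male": "M", "f": "F", "female": "F"}

def cleanse_gender(raw: str | None) -> str:
    """Map an inconsistent source value to a standard code, or flag it."""
    if raw is None:
        return "U"  # unknown: source value is missing
    return GENDER_CODES.get(raw.strip().lower(), "U")

dirty = ["M", " Female", "f", "mal", None]
print([cleanse_gender(v) for v in dirty])  # ['M', 'F', 'F', 'U', 'U']
```

Real cleansing efforts add matching, de-duplication and reconciliation on top of standardization, which is exactly why the “short time frame, low cost” promise is so hard to keep.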