Chapter 8. Master Data Management
This chapter defines the key principles and data structures of master data management architectures used to master data volumes. It illustrates the alignment and synergy of data governance and master data management and discusses how they support data quality engineering. I define master data management (MDM) as the process of establishing and implementing the architectures, standards, processes, policies, and tools used to define and manage critical data in order to provide a single mastered volume of validated, approved, and certified data to business functions across the firm. Mastering data means the data is organized into domain-specific volumes where the relevant data quality validations and anomaly detection techniques have been applied to certify and approve it for use.
The financial industry has recognized that neither the proliferation of highly distributed, independent data silos nor a single, large, centralized data store optimally supports best practices in data architecture and data management. Instead, the industry recognizes that natural collections of data — such as security, reference, holdings, transactions, prices, client accounts, and performance — each have unique architecture, quality, and data management requirements for database and file structure implementations, data retention, and data access.
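To make the idea of domain-specific mastering concrete, here is a minimal sketch, not taken from the book, in which every name (DomainMaster, certify, the price rules) is a hypothetical illustration: each data domain carries its own validation rules, and only records that pass every rule are certified for downstream business use.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch: a rule is a predicate over a raw record.
Rule = Callable[[dict], bool]

@dataclass
class DomainMaster:
    """One mastered volume for a single data domain (e.g., prices)."""
    domain: str
    rules: list[Rule] = field(default_factory=list)

    def certify(self, records: list[dict]) -> tuple[list[dict], list[dict]]:
        """Split records into certified (all rules pass) and rejected."""
        certified, rejected = [], []
        for rec in records:
            target = certified if all(rule(rec) for rule in self.rules) else rejected
            target.append(rec)
        return certified, rejected

# Example price domain: require a positive price and a known currency.
price_master = DomainMaster(
    domain="prices",
    rules=[
        lambda r: r.get("price", 0) > 0,
        lambda r: r.get("currency") in {"USD", "EUR", "GBP"},
    ],
)

good, bad = price_master.certify([
    {"symbol": "IBM", "price": 182.5, "currency": "USD"},
    {"symbol": "XYZ", "price": -1.0, "currency": "USD"},  # fails positive-price rule
])
```

A client-account or transaction domain would plug in its own rule set (and anomaly detectors) into the same certification step, which is the sense in which each domain volume has unique quality requirements.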
There are many different variations and definitions of master data management, just as there are many ...