Master Data Management
About this ebook
In 1941, a new term was added to the Oxford English Dictionary: Information Explosion (Press, 2013). The term describes the growth in information content observed over the past seven decades, beginning with Fremont Rider, a university librarian who in 1944 estimated that information in university libraries would double in size every sixteen years. Nearly seventy years later, Bounie and Gille produced a report called “International Production and Dissemination of Information” and concluded that the world in 2008 produced 14.7 exabytes of new information; three times the amount of information produced just five years earlier (Press, 2013). This rapid growth in information content has generated a greater need for organizations to evaluate how key organizational content is managed to achieve strategic goals and to remain competitive in today’s business environment.
Binayaka Mishra
Binayaka Mishra is an experienced IT professional with 14+ years of experience across tools and technologies such as Data Warehousing, Big Data Analytics, Cloud Computing, Reporting Analytics and Project Management documentation. He graduated in Computer Science & Engineering from the National Institute of Science & Technology, Berhampur, Odisha, India in 2002. He has worked in several critical roles with MNCs such as Tech Mahindra, Oracle Corporation, Wipro Technology, CapGemini UK, CapGemini India Pvt Ltd, UBS, AoN Hewitt Associates India Pvt Ltd, HMRC (UK) and TUI Travel Plc (UK). Beyond the technical side, his mastery extends to functional domains such as Payroll Processing, Tax Calculation, UK NI, BFSI, Telecommunication, Corporate Tax measurement divisions, Investment Banking, Automotive, Asset Management, Security, and Travel & Tourism. Currently working as a Solution Architect / Project Manager at Tech Mahindra, India, he loves listening to music, playing snooker and bowling, and is a keen swimmer. More information can be found on his LinkedIn profile: https://www.linkedin.com/in/binayaka-mishra-b09612142/ For any comments or advice, please feel free to write to: mishra.binayaka.18005@gmail.com
Book preview
Master Data Management - Binayaka Mishra
Chapter 1: Shaolin Tale
When it comes to Kung Fu, what comes to mind first is the Shaolin masters, who created one of the most devastating and disciplined combat methods the world has known, built since ancient times to overcome the odds of life. If we go one step further down into the origins of the Shaolin Temple, where Kung Fu was actually invented, we find that it originated and was developed in the Buddhist Shaolin temple in Henan province, China. During the 1,500 years of its development, Shaolin kung fu became one of the largest schools of kung fu. Likewise, Master Data Management, AKA MDM, is the technology that grew out of CDI (Customer Data Integration), ERP (Enterprise Resource Planning) and PLM (Product Lifecycle Management), these being the masters of data management.
In 1941, a new term was added to the Oxford English Dictionary: Information Explosion (Press, 2013). The term describes the growth in information content observed over the past seven decades, beginning with Fremont Rider, a university librarian who in 1944 estimated that information in university libraries would double in size every sixteen years. Nearly seventy years later, Bounie and Gille produced a report called “International Production and Dissemination of Information” and concluded that the world in 2008 produced 14.7 exabytes of new information; three times the amount of information produced just five years earlier (Press, 2013). This rapid growth in information content has generated a greater need for organizations to evaluate how key organizational content is managed to achieve strategic goals and to remain competitive in today’s business environment.
For more than a decade, organizations have adopted a number of different approaches to data integration: from Data Warehousing in the early-to-mid 1990s, striving to achieve informational integration, through to ERP in the mid-to-late 1990s, focusing on operational (process and data) integration. Organizations have expected enterprise technologies to provide real, tangible business benefits, with buzzwords like ‘integration’, ‘collaboration’ and ‘optimization’ proposed to ensure definite success. As a result, organizations around the world invested billions in Data Warehousing and ERP initiatives; unfortunately, this confidence in technology was misplaced, as only a very small number of implementations were successful. We argue that the most important factor in the emergence of MDM has been the unrealized benefits of previous ERP implementations and unresolved informational IS requirements. Indeed, these previous approaches to integration have facilitated the emergence of MDM, which is set to define the organizational landscape for the next five years or so (a fashion cycle) as the solution to the data and information integration problem. In the following sections we present a brief historical account of organizations’ approaches to data integration, namely: Data Warehousing, ERP and ERP II/BI.
Reflecting on the early-to-mid 1990s, Data Warehousing can be described as an informational solution to an operational problem in terms of data integration. The limitations of traditional Management Information Systems (MIS), perceived as unable to maintain a consistent view of an organization’s reconciled data, were precisely what a Data Warehousing system promised to overcome. To address the problems of the traditional approaches to accessing large amounts of data in heterogeneous, autonomous, distributed systems, Data Warehousing introduced the concept of a ‘logically centralized data repository’. The concept of Data Warehousing therefore emerged as IS objectives within organizations evolved and as the demand to analyze (internal and external) business information grew.
1.1. 1995 to 2000
Similar to the experience with Data Warehousing, there was no agreed definition of ERP systems, although their characteristics position them as integrated, all-encompassing, complex mega-packages designed to support the key functional areas of an organization. By design, therefore, an ERP is an operational-level system. By the mid-to-late 1990s, ERP vendors provided an alternative operational solution to the data integration problem, retiring the fragmented legacy systems that had previously operated throughout the organization. Furthermore, given their scope, ERP systems also promised to deliver on the informational requirements of an organization; as a result, the perceived need for Data Warehousing, and with it the rate of Data Warehousing project implementations, was reduced. Because an ERP implementation replaced many of the legacy systems throughout the organization, it can be perceived as the ‘base line application’, containing integrated application data generated as a ‘by-product of transaction processing’, or as an ODS (Operational Data Store), a ‘hybrid structure’ that contains some aspects of a data warehouse and other aspects of a transaction processing environment. Many research studies of ERP implementations have reported how the failure to properly analyze requirements and understand the impact of the changes brought about by ERP implementations has created problems for implementing organizations and has curtailed the extent to which they have been able to derive benefits from their investments. As organizations moved toward the post-implementation phase of their ERP projects (post Y2K for the vast majority of organizations), the real issue of benefit realization emerged. Pallatto added that concessions and compromises in the design of rushed Y2K ERP projects had negative impacts on systems performance and benefits, which were not promptly and fully communicated to the implementing organization.
1.2. 2000 to 2005
One benefit in particular which did not materialize was the provision of an integrated informational platform to facilitate reporting on every aspect of an organization’s activities. This led organizations to reconsider undertaking Data Warehousing projects post-ERP implementation. Post-Y2K, many organizations therefore discovered that the solution to leveraging their investment in, and retrieving useful data from, an ERP system was to undertake additional initiatives in conjunction with the already implemented ERP system: Data Warehousing; ERP II initiatives embracing the concepts of PIM (Product Information Management) and CDI (Customer Data Integration); and Business Intelligence. Indeed, Ventana Research highlights the fact that over half of the organizations considering MDM have already implemented a PIM or CDI master data deployment. The harsh reality of ERP implementation, at the expense of those organizations that invested resources in the initiative, is that ERP only facilitated getting data into the system; it did not prepare data for use and analysis, because ERP systems lack certain functionality and reporting capabilities. Many organizations experienced frustration when they attempted to use their ERP system to access information and knowledge. It was quickly realized that ERP systems are good at storing, accessing and executing the data used in daily transactions, but not at providing the information needed for long-term planning and decision making, as ERP systems are not designed to know how the data is to be used once it is gathered. As we argued earlier, this has led to the emergence of the Master Data Management (MDM) concept.
To harness enterprise data integration, which is an integral part of CDI, and to avoid the critical mistakes involved, any enterprise concerned with the integrity of its customer data should review the eight critical components of a customer data management environment:
i. Business-driven accuracy definitions and thresholds
ii. Data investigation and analysis
iii. Comprehensive conversion plans – and dress rehearsals
iv. Symmetrical update routines for batch and online worlds
v. Preventative Maintenance
vi. Data Audits
vii. Enhance customer data
viii. Data Stewardship
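Several of the components above (data investigation, symmetrical update routines, data audits) revolve around detecting duplicate customer records. As a minimal illustration only, and not a depiction of any particular CDI product, a rule-based duplicate check might compare normalized name and address fields against a business-defined threshold; the function names, fields and threshold below are all hypothetical:

```python
import re
from difflib import SequenceMatcher

def normalize(record):
    """Lower-case the fields and strip punctuation and stray whitespace."""
    return {k: re.sub(r"[^a-z0-9 ]", "", v.lower()).strip()
            for k, v in record.items()}

def similarity(a, b):
    """String similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

def is_duplicate(rec1, rec2, threshold=0.85):
    """Flag two customer records as likely duplicates when both the
    name and the address similarities clear the chosen threshold."""
    r1, r2 = normalize(rec1), normalize(rec2)
    return (similarity(r1["name"], r2["name"]) >= threshold
            and similarity(r1["address"], r2["address"]) >= threshold)

a = {"name": "John A. Smith", "address": "12 High Street, Leeds"}
b = {"name": "Jon A Smith",   "address": "12 High St, Leeds"}
print(is_duplicate(a, b))  # True
```

In practice, the accuracy thresholds would come from the business-driven definitions listed as component i, not from a hard-coded default.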
To hone data quality through data governance, organizations can further assess their CDI maturity against the following questions:
1. Does data quality have executive-level sponsorship?
A. Board level
B. Other
C. None
2. Do you have established accuracy definitions and thresholds?
A. Enterprise basis
B. Project basis
C. None
3. Do you conduct regular data quality audits?
A. Internal & external
B. Internal only
C. None
4. Do you have common data entry standards?
A. Across enterprise
B. Within business line
C. None
5. Is data quality awareness part of new staff induction?
A. All staff
B. ‘Relevant’ staff only
C. None
6. Do you have a dedicated data quality staff?
A. Team
B. An individual
C. None
7. How is your data quality budget handled?
A. Separate major budget line
B. Separate line in each project
C. Ad hoc funded from project
Total Scoring:
A = 3 points
B = 2 points
C = 1 point
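The seven-question self-assessment and its point values can be encoded directly. The sketch below is illustrative; the tier labels are shorthand for the score bands discussed next (17 or higher, 10 to 16, below 10):

```python
POINTS = {"A": 3, "B": 2, "C": 1}

def score_assessment(answers):
    """Total the seven answers (each 'A', 'B' or 'C') and map the
    result onto the three commitment tiers described in the text."""
    total = sum(POINTS[a] for a in answers)
    if total >= 17:
        tier = "leader"
    elif total >= 10:
        tier = "progressing"
    else:
        tier = "at risk"
    return total, tier

print(score_assessment(["A", "A", "B", "A", "B", "A", "B"]))  # (18, 'leader')
```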
How Does Your Organization’s Data Quality Commitment Stack Up? If you scored 17 or higher, congratulations! You are among the leaders in taking advantage of the strategic resource that your customer data represents. If you scored between 10 and 16, your organization has taken important steps towards protecting the value of its customer data, but more work remains to be done; your organization’s marketing initiatives and strategic decision-making are probably still at risk due to faulty data. If you scored below 10, you should consider strategies for achieving executive buy-in to the importance of data quality. Without a significant change in how your organization views and manages its customer data, you risk losing ground, and customers, to competitors. To achieve a sustained cultural commitment to data quality, you have to be able to justify the investment in real financial terms:
i. Hard ROI
ii. Cost savings
iii. Reduction in operational risk
By connecting the definition of quality and measurement scale to how the data is used, i.e., whether it’s for call centre, marketing, risk management or management information, you can identify or estimate the value of certain customer-based events, such as:
i. The cost of loyal customers who are lost because your call centre didn’t recognize them or didn’t have the right information to properly service them.
ii. The cost of sending duplicate mailings to the same customer or household, including production and postage – and multiplying that over years of multiple mailings.
iii. The money that can be saved by preventing the over-extension of credit to a customer who deals with several different departments of your organization or makes purchases under different aliases.
iv. The value of accurate customer information in making real-time pricing decisions for customers who expect one-to-one personalization.
v. The value of protecting your corporate brand and avoiding costly fines by preventing compliance violations that result from conducting business with a person or business (or their respective aliases) that appears on one or more government sanction lists.
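Item ii above lends itself to a simple back-of-the-envelope estimate. The sketch below is illustrative and every figure in it is hypothetical, not drawn from any real organization:

```python
def duplicate_mailing_cost(duplicate_records, mailings_per_year,
                           cost_per_piece, years):
    """Estimate the cumulative cost of mailing the same customer twice:
    each duplicate record generates one redundant piece per mailing."""
    return duplicate_records * mailings_per_year * cost_per_piece * years

# Hypothetical figures: 5,000 duplicate records, 6 mailings a year,
# $1.20 production plus postage per piece, accumulated over 3 years.
print(duplicate_mailing_cost(5_000, 6, 1.20, 3))  # 108000.0
```

Even with modest assumptions, the multiplication over years of repeated mailings is what turns a data quality nuisance into a visible budget line.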
1.3. January 2005
The data quality implementation plans furnished above for CDI resulted in the following findings:
i. The evaluation process does not focus attention on identifying the client’s business-driven data quality needs, and how each vendor’s offering relates to these needs
ii. The standard list of features and functions included in the qualifying questionnaire may have little to do with how the client will actually use the data quality software selected
iii. During the proof of concept, the competing vendors’ solutions are often ‘tested’ by using a sample of the client’s data that is not valid in size or composition to produce reliable and meaningful results
iv. The data quality software is implemented on a project-level basis and typically does not take full advantage of the robustness of the selected solution’s enterprise capabilities
The following are important ways in which a data quality specialist can strengthen the software selection and implementation process:
1. Focusing the Process on the Client’s Needs. With this expertise, the specialist can help focus the evaluation and selection process on the client’s own situation, in terms of:
i. How the data quality solution will be used
ii. Appropriate accuracy levels to be achieved to meet the client’s real business needs
iii. IT resources required for a robust implementation and on-going maintenance
iv. Realistic implementation schedules in relation to the client’s own time constraints
2. Generating Creative, Client-Focused Problem Solving:
To take the client-focused approach a step further, the data quality specialist can position part of the RFP as a challenge based on the client’s unique data quality needs and objectives (the bulleted items listed under #1). Each vendor will be asked to present their best solution for maximizing the client’s immediate and on-going data quality performance, within the given parameters.
3. Selecting an Appropriate Sample for Testing.
4. Providing a Robust Implementation. The data quality specialist can help to ensure that:
i. The implementation is robust and takes full advantage of the software’s functionality and capabilities in relation to the client’s needs
ii. Appropriate data quality accuracy levels are established, based on the data’s business uses
iii. A program for regular data quality audits and on-going maintenance is established
5. Key Questions for Selecting a Data Quality Vendor:
i. Determining how the data quality suite will actually be used
ii. Defining the accuracy levels that will be needed to meet the client’s business needs
iii. Profiling the data to determine existing quality levels and identify potential problem areas that must be addressed
iv. Performing a test run using an appropriate, statistically significant data sample prior to the full integration
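Item iii, profiling the data, can be sketched with a few basic per-column metrics. The shape analysis (digits rendered as 9, letters as A) is a common profiling convention; the function name, the column and the sample values below are purely illustrative:

```python
from collections import Counter

def profile_column(values):
    """Basic profile of one column: row count, missing values,
    distinct values, and the most common value shapes."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    patterns = Counter(
        "".join("9" if c.isdigit() else "A" if c.isalpha() else c for c in v)
        for v in non_null
    )
    return {
        "rows": len(values),
        "missing": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }

# Hypothetical postcode column with blanks and a duplicate value.
postcodes = ["LS1 4DN", "M1 1AE", "", "LS1 4DN", None, "B33 8TH"]
print(profile_column(postcodes))
```

A profile like this surfaces the problem areas (missing values, unexpected shapes) that the specialist would then address before conversion.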
In summary, a customer data conversion can be considered successful only if the resulting data meets the needs of both its business users and the IT team. To help achieve this objective, both groups should be included on the conversion planning and implementation team. To help ensure that the conversion is delivered on time and on budget, the project should include six critical steps:
(1) select an experienced data conversion consultant to lead the project;
(2) use an automated data profiling tool to thoroughly investigate the data;
(3) perform a dress rehearsal using small, statistically significant data samples;
(4) based on the results of the dress rehearsal, update the project estimates and projections;
(5) conduct a large-volume conversion test;
(6) run the conversion.
By working together, the Business and IT users are very nearly guaranteeing a successful conversion.
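Steps (3) and (5) both call for statistically significant samples, but the text does not prescribe a sizing method. One common choice, offered here only as an illustrative assumption, is Cochran's formula for estimating a proportion, with a finite-population correction applied for smaller files:

```python
import math

def cochran_sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula for a proportion, corrected for a
    finite population. Defaults: 95% confidence, +/-5% margin of error,
    and p = 0.5 as the most conservative variance assumption."""
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# e.g. a hypothetical 2-million-row customer file
print(cochran_sample_size(2_000_000))  # 385
```

Note how quickly the required sample plateaus: a 2-million-row file and a 20,000-row file need nearly the same sample, which is why a dress rehearsal on a properly drawn small sample can be so informative.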