
INFORMATICA DVO

Introduction
Topics Covered
• Purpose of data validation and role of DVO
• DVO Architecture
• Benefits of DVO
• Real-time scenario: high-level procedure explanation
Purpose of Data Validation and role of DVO
• Data Validation is the process of verifying the accuracy and
completeness of data after a data migration, replication, integration
or other similar movement or transformation exercise.
• DVO integrates with the Informatica PowerCenter Repository and
Integration Services and enables developers and business analysts to
create rules to test the data.
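The kind of rule DVO lets analysts define can be illustrated with a minimal sketch in plain Python using sqlite3 (this is not the DVO API; the table names and key column are hypothetical). It compares row counts between a migrated source/target pair and flags keys that went missing in the move:

```python
import sqlite3

def validate_table_pair(conn, src, tgt, key):
    """Compare row counts and count rows whose key exists in src but not in tgt."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    missing = conn.execute(
        f"SELECT COUNT(*) FROM {src} WHERE {key} NOT IN (SELECT {key} FROM {tgt})"
    ).fetchone()[0]
    return {"src_count": src_count, "tgt_count": tgt_count, "missing_in_tgt": missing}

# Hypothetical pre- and post-migration copies of the same table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_patients (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt_patients (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src_patients VALUES (?, ?)", [(1, "A"), (2, "B"), (3, "C")])
conn.executemany("INSERT INTO tgt_patients VALUES (?, ?)", [(1, "A"), (2, "B")])

result = validate_table_pair(conn, "src_patients", "tgt_patients", "id")
print(result)  # row with id 3 was lost during the move
```

DVO packages this sort of check (and richer value-level tests) behind a GUI, generating the comparison logic for you instead of requiring hand-written queries.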
DVO Architecture
Interaction with PowerCenter

• Data Validation Option requires installation and setup of PowerCenter.
• Source and target table and file definitions are imported from PowerCenter repositories.
• You set up table pairs and test rules in Data Validation Option. This test metadata is stored in the Data Validation Option repository.
• When the tests are run, DVO communicates with PowerCenter through an API to create the appropriate mappings, sessions, and workflows, and to execute them.
• PowerCenter connects to the data being tested instead of Data Validation Option.
• After the tests are executed, results are stored in the Data Validation Option repository and displayed in the DVO client.
Benefits of DVO
• DVO reduces the time required for data validation and production data
auditing and verification significantly, in comparison to traditional methods
like SQL validation with minus queries, row counts etc.
• Maintaining different test scripts to validate data for different projects is
cumbersome. DVO provides an easy-to-use GUI for testing the rules
created for data validation across multiple projects.
• No programming skills needed to create validation tests.
• DVO includes a repository with reporting capabilities to provide a complete
audit trail of all tests and their results.
• It reads data definitions from PowerCenter metadata repositories and can
easily deal with data definition changes.
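For contrast, the "traditional" validation the slide refers to amounts to hand-written minus queries maintained per table pair. A sketch of that approach (SQLite uses EXCEPT where Oracle uses MINUS; the table names are illustrative):

```python
import sqlite3

# Hypothetical pre- and post-upgrade copies of one target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE old_tgt (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE new_tgt (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO old_tgt VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO new_tgt VALUES (?, ?)", [(1, 10.0), (2, 25.0)])

# One such query must be written and maintained per table pair:
diff = conn.execute(
    "SELECT * FROM old_tgt EXCEPT SELECT * FROM new_tgt"
).fetchall()
print(diff)  # rows whose values changed or disappeared after the upgrade
```

Multiply this by hundreds of table pairs and the maintenance burden DVO removes becomes clear.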
Real-time scenario: procedure explanation
• Data validation took place during the migration of mappings for a
version upgrade.
• Upgrade was from Informatica PowerCenter 9.x to 10.
• Tables involved : 78
• Mappings involved : 500+
• Type of Data : Healthcare (very high volume)
• Data nature : Incremental
• Time taken for validation: 20-25 days
Steps Followed from a Developer's Perspective
• Folder migration was done from 9.x repository to 10.
• Migrating folders covers source tables, target tables, mappings, and
workflows.
• Separate test repositories were created because the actual workflows were
scheduled. During migration, the exported XML was edited to point to the
new DB connections.
• Test tables were created in the database for both PowerCenter
versions. As the data is incremental and subject to daily change,
creating separate test tables is mandatory in order to verify data integrity.
• The test tables were exact replicas of the actual tables, created under different
DB connections.
• After ensuring successful migration of the PowerCenter elements, the
workflows were triggered in parallel in both the 9.x and 10 versions to load
them with the same data.
• The DVO connections were established, connecting to the respective
Informatica repositories.
• Folders for each project were created under the respective connections.
• Table pairs for all source and targets were created.
• Rule:
• 9.x src_table ||10 src_table
• 9.x tgt_table ||10 tgt_table
• Test rules were defined.
• Table pair tests were run.
• Reports were generated and submitted.
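The table-pair workflow above can be sketched outside DVO as a loop over pairs, comparing the 9.x-loaded and 10-loaded test tables (the connections and table names here are hypothetical stand-ins; real DVO generates PowerCenter mappings to do this comparison):

```python
import sqlite3

# Two in-memory databases stand in for the 9.x and 10 test schemas.
conn_9x = sqlite3.connect(":memory:")
conn_10 = sqlite3.connect(":memory:")
for conn in (conn_9x, conn_10):
    conn.execute("CREATE TABLE tgt_claims (id INTEGER, status TEXT)")
conn_9x.executemany("INSERT INTO tgt_claims VALUES (?, ?)", [(1, "PAID"), (2, "OPEN")])
conn_10.executemany("INSERT INTO tgt_claims VALUES (?, ?)", [(1, "PAID"), (2, "OPEN")])

# Table pairs in the "9.x table || 10 table" form used above.
table_pairs = [("tgt_claims", "tgt_claims")]

def run_pair(t9, t10):
    """PASS if both versions loaded identical rows, FAIL otherwise."""
    rows_9 = sorted(conn_9x.execute(f"SELECT * FROM {t9}").fetchall())
    rows_10 = sorted(conn_10.execute(f"SELECT * FROM {t10}").fetchall())
    return "PASS" if rows_9 == rows_10 else "FAIL"

report = {f"{t9}||{t10}": run_pair(t9, t10) for t9, t10 in table_pairs}
print(report)  # {'tgt_claims||tgt_claims': 'PASS'}
```

Scaled to the 78 tables and 500+ mappings in this scenario, DVO's generated tests and built-in reporting replace this sort of scripting entirely.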
