POWERCENTER 8.1.0
DESIGNER
Content
Designer Overview
Designer
Designer is used to create mappings that contain transformation instructions for the
Integration Service. The Designer has the following tools that we use to analyze sources,
design target schemas, and build source-to-target mappings.
Source Analyzer
It imports or creates source definitions.
Target Designer
It imports or creates target definitions.
Transformation Developer
It develops transformations to use in mappings. We can also develop user-defined functions to use in expressions.
Mapplet Designer
It creates sets of transformations to use in mappings.
Mapping Designer
It creates mappings that the Integration Service uses to extract, transform, and load data.
Designer Overview
The Designer displays the following windows:
Navigator
Use it to connect to repositories and open folders. We can also copy objects and create shortcuts within the Navigator.
Workspace
Open different tools in this window to create and edit repository objects, such as sources, targets, mapplets, transformations, and mappings.
Output
View details about tasks we perform, such as saving work or validating a mapping.
Designer Windows
integration * intelligence * insight
Designer Overview
Status bar
It displays the status of the operation we perform.
Overview
An optional window to simplify
viewing a workspace that
contains a large mapping or
multiple objects. Outlines the
visible area in the workspace
and highlights selected objects
in color.
Instance data
View transformation data while
you run the Debugger to debug
a mapping.
Target data
View target data while you run
the Debugger to debug a
mapping.
Designer Overview
Informatica PowerCenter 8 can access the following data sources and load into the following targets.
Sources
Relational: Oracle, Sybase ASE, Informix, IBM DB2, Teradata
File: flat file, COBOL file, XML file, web log
Application: Hyperion Essbase, IBM MQSeries, JMS, Microsoft Message Queue
Mainframe: Adabas, IBM DB2 OS/390
Other: Microsoft Excel, Microsoft Access, external web services
Designer Overview
Targets
Relational: Oracle, Sybase ASE, Informix, IBM DB2, Microsoft SQL Server, Teradata
File: flat file, XML file
Application: Hyperion Essbase, IBM MQSeries, IBM DB2 OLAP Server, JMS, Microsoft Message Queue, mySAP, PeopleSoft EPM, SAP BW, SAS, Siebel, TIBCO, WebMethods
Mainframe: IBM DB2 OS/390, IBM DB2 OS/400, VSAM
Other: Microsoft Access, external web services
About Transformation
A transformation is a repository object that generates, modifies, or passes data.
We configure the logic in a transformation that the Integration Service uses to transform data.
About Transformation
Designer Transformations
Tasks to incorporate a transformation into a mapping:
Create the transformation.
Configure the transformation.
Link the transformation to other transformations and target definitions.
Aggregator - to perform aggregate calculations, such as "group by".
Expression - to use various expressions.
Filter - to filter data with a single condition.
Joiner - to join data from separate databases, files, or ODBC sources.
Lookup - to create a local copy of the data.
Normalizer - to transform denormalized data into normalized data.
Rank - to select only top (or bottom) ranked data.
Sequence Generator - to generate unique IDs for target tables.
Source Qualifier - to filter sources (SQL override, select distinct, join, etc.).
Stored Procedure - to run stored procedures in the database and capture their return values.
Update Strategy - to flag records in the target for insert, delete, or update (defined inside a mapping).
Router - same as Filter, but with multiple conditions.
Java Transformation - provides a simple native programming interface to define transformation functionality with the Java programming language.
Reusable transformation - a transformation that can be used in multiple mappings.
Transformations can be created in the Mapping Designer, the Transformation Developer, and the Mapplet Designer.
Types Of Transformation
Active transformations
Filter
Router
Update Strategy
Aggregator
Sorter
Rank
Joiner
Normalizer
Passive transformations
Sequence Generator
Stored Procedure
Expression
Lookup
Aggregator Transformation
The Aggregator is an active transformation.
The Aggregator transformation allows us to perform aggregate calculations, such as averages and sums.
The Aggregator transformation is unlike the Expression transformation in that we can use the Aggregator transformation to perform calculations on groups.
The Expression transformation permits us to perform calculations on a row-by-row basis only.
We can use conditional clauses to filter rows, providing more flexibility than the SQL language.
The Integration Service performs aggregate calculations as it reads, and stores necessary group and row data in an aggregate cache.
Components of the Aggregator Transformation
Aggregate expression
Group by port
Sorted input
Aggregate cache
Aggregate Expression
An aggregate expression can include conditional clauses and non-aggregate functions. It can also include one aggregate function nested within another aggregate function, such as:
MAX( COUNT( ITEM ))
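Outside PowerCenter, the nested aggregate above can be sketched in a few lines of Python; the rows, function name, and data are invented for illustration:

```python
from collections import Counter

# Hypothetical order rows: (store_id, item) pairs, invented for this example.
rows = [
    ("S1", "pen"), ("S1", "pen"), ("S1", "ink"),
    ("S2", "pad"), ("S2", "pad"), ("S2", "pad"),
]

def max_count_item(rows):
    """Equivalent of the aggregate expression MAX( COUNT( ITEM ) ):
    count the rows per ITEM, then take the largest count."""
    counts = Counter(item for _, item in rows)
    return max(counts.values())

print(max_count_item(rows))  # 3 — the most frequent item appears three times
```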
Aggregate Functions
The aggregate functions can be used within an Aggregator transformation.
We can nest one aggregate function within another aggregate function.
AVG
COUNT
FIRST
LAST
MEDIAN
MAX
MIN
STDDEV
PERCENTILE
SUM
VARIANCE
Conditional Clauses
We use conditional clauses in the aggregate expression to reduce the number of rows used
in the aggregation. The conditional clause can be any clause that evaluates to TRUE or
FALSE.
Null Values in Aggregate Functions
When we configure the Integration Service, we can choose how we want the Integration
Service to handle null values in aggregate functions. We can choose to treat null values in
aggregate functions as NULL or zero. By default, the Integration Service treats null values
as NULL in aggregate functions.
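The two null-handling modes can be sketched as follows; the values and function names are invented, and AVG is used because the two modes produce visibly different results:

```python
# Sample column with NULLs (None), invented for this example.
values = [10, None, 20, None, 30]

def avg_nulls_as_null(vals):
    """Default mode: NULLs are treated as NULL and ignored by the aggregate."""
    non_null = [v for v in vals if v is not None]
    return sum(non_null) / len(non_null) if non_null else None

def avg_nulls_as_zero(vals):
    """Alternate mode: NULLs are replaced with zero before aggregating."""
    filled = [0 if v is None else v for v in vals]
    return sum(filled) / len(filled)

print(avg_nulls_as_null(values))  # 20.0 — average of the three non-null values
print(avg_nulls_as_zero(values))  # 12.0 — average over all five values
```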
Expression Transformation
We can use the Expression transformation to calculate values in a single row before we write to the target.
We can use the Expression transformation to test conditional statements.
To perform calculations involving multiple rows, such as sums or averages, we use the Aggregator transformation.
We can use the Expression transformation to perform any non-aggregate calculations.
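The single-row behavior above can be illustrated with a short Python sketch; the port and field names are invented:

```python
# Row-by-row calculation, as an Expression transformation would perform it.
rows = [
    {"price": 100.0, "quantity": 2},
    {"price": 40.0, "quantity": 5},
]

def expression_total(row):
    """Hypothetical output port TOTAL = PRICE * QUANTITY, one row at a time."""
    return row["price"] * row["quantity"]

for row in rows:
    row["total"] = expression_total(row)

print([r["total"] for r in rows])  # [200.0, 200.0]
```

Each row is computed independently; nothing is accumulated across rows, which is exactly what distinguishes an Expression from an Aggregator.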
Creating an Expression Transformation
In the Mapping Designer, click Transformation > Create.
Select the Expression transformation and click OK.
The naming convention for Expression transformations is
EXP_TransformationName
Create the input ports
If we have the input transformation available, we can
select Link Columns from the Layout menu and then
drag each port used in the calculation into the
Expression transformation or we can open the
transformation and create each port manually.
Repeat the previous step for each input port we want to
add to the expression
Create the output ports we need
Expression Transformation
Setting the Expression in the Expression Transformation
Enter the expression in the Expression Editor for each output port (input ports cannot hold expressions).
Check the expression syntax by clicking Validate.
Connect to Next Transformation
Connect the output ports to the
next transformation or target.
Select a Tracing Level on the
Properties Tab
Select a tracing level on the
Properties tab to determine the
amount of transaction detail
reported in the session log file.
Click Repository > Save.
Filter Transformation
A Filter transformation is an active transformation.
We can filter rows in a mapping with a Filter transformation.
We pass all the rows from a source transformation through the Filter transformation and then enter a filter condition for the transformation.
All ports in a Filter transformation are input/output, and only rows that meet the condition pass through the Filter transformation.
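The pass-or-drop behavior can be sketched in Python; the condition and field names are invented for illustration:

```python
# A Filter transformation passes only rows that satisfy a single condition.
rows = [
    {"employee": "A", "salary": 30000},
    {"employee": "B", "salary": 80000},
    {"employee": "C", "salary": 55000},
]

def filter_condition(row):
    """Hypothetical filter condition: SALARY > 50000. Rows that fail are dropped."""
    return row["salary"] > 50000

passed = [row for row in rows if filter_condition(row)]
print([r["employee"] for r in passed])  # ['B', 'C']
```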
Creating a Filter Transformation
In the Mapping Designer, click Transformation > Create.
Select the Filter transformation. Enter a name, and click
OK.
The naming convention for Filter transformations is
FIL_TransformationName.
Select and drag all the ports from a source qualifier or other
transformation to add them to the Filter transformation.
After we select and drag ports, copies of these ports
appear in the Filter transformation. Each column has both
an input and an output port.
Double-click the title bar of the filter transformation to edit
transformation properties.
Filter Transformation
Click the Value section of the condition, and then click the Open button.
The Expression Editor appears.
Enter the filter condition we want to apply.
Use values from one of the input ports in the transformation as part of this condition.
However, we can also use values from output ports in other transformations.
We may have to fix syntax errors before continuing.
Click OK.
Select the Tracing Level, and click OK to return to the Mapping Designer.
Click Repository > Save.
Joiner Transformation
A Joiner transformation is an active transformation.
The Joiner transformation is used to join source data from two related heterogeneous sources residing in different locations or file systems.
We can also join data from the same source.
The Joiner transformation joins sources with at least one matching column.
The Joiner transformation uses a condition that matches one or more pairs of columns between the two sources.
We can join heterogeneous sources such as two relational tables in different databases, or a flat file and a relational source.
Joiner Transformation
Drag all the input/output ports from the first source into the Joiner transformation.
The Designer creates input/output ports for the source fields in the Joiner transformation as detail fields by default. We can edit this property later.
Select and drag all the input/output ports from the second source into the Joiner transformation.
The Designer configures the second set of source fields as master fields by default.
Edit Transformation
Double-click the title bar of the Joiner transformation to open the Edit Transformations dialog box.
Select the Ports tab.
Add default values for specific ports as necessary.
Setting the Condition
Select the Condition tab and set the condition.
Click the Add button to add a condition.
Click the Properties tab and configure properties for the transformation.
Click OK.
Joiner Transformation
Defining the Join Type
Join is a relational operator that combines data from multiple tables into a single result set.
We define the join type on the Properties tab in the transformation.
The Joiner transformation supports the following types of joins.
Normal
Master Outer
Detail Outer
Full Outer
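The four join types can be sketched in Python under PowerCenter's definitions, where a master outer join keeps all detail rows (discarding unmatched master rows) and a detail outer join keeps all master rows; table contents and key names are invented:

```python
# master: key -> master value; detail: (key, detail value) rows — invented data.
master = {1: "Oracle", 2: "Sybase"}
detail = [(1, "orderA"), (1, "orderB"), (3, "orderC")]

def join(master, detail, join_type):
    result = []
    matched_master = set()
    for key, d in detail:
        if key in master:
            result.append((key, master[key], d))   # matched pair (Normal keeps only these)
            matched_master.add(key)
        elif join_type in ("master_outer", "full_outer"):
            # master outer: keep all detail rows, even without a master match
            result.append((key, None, d))
    if join_type in ("detail_outer", "full_outer"):
        # detail outer: keep master rows that had no matching detail row
        for key, m in master.items():
            if key not in matched_master:
                result.append((key, m, None))
    return result

print(len(join(master, detail, "normal")))      # 2 matched rows
print(len(join(master, detail, "full_outer")))  # 4 — matched + unmatched from both sides
```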
Lookup Transformation
A Lookup transformation is a passive transformation.
Use a Lookup transformation in a mapping to look up data in a flat file or a relational table,
view, or synonym.
We can import a lookup definition from any flat file or relational database to which both the
PowerCenter Client and Integration Service can connect.
We can use multiple Lookup transformations in a mapping.
The Integration Service queries the lookup source based on the lookup ports in the
transformation.
It compares Lookup transformation port values to lookup source column values based on
the lookup condition.
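The query-and-compare step can be sketched as a simple keyed lookup in Python; the data, port names, and condition are invented for illustration:

```python
# A lookup source, keyed by the lookup condition column (invented data).
lookup_source = {
    101: {"customer_name": "Acme", "city": "Pune"},
    102: {"customer_name": "Globex", "city": "Delhi"},
}

def lookup(customer_id):
    """Hypothetical lookup condition: CUSTOMER_ID = IN_CUSTOMER_ID.
    Returns the matching lookup row, or None when no row matches."""
    return lookup_source.get(customer_id)

row = {"customer_id": 101, "amount": 250}
match = lookup(row["customer_id"])
row["customer_name"] = match["customer_name"] if match else None
print(row["customer_name"])  # Acme
```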
Unconnected Lookup
An unconnected Lookup transformation is not connected to the pipeline; it is called from an expression in another transformation and returns one value to that expression.
Lookup Transformation
Tasks Of Lookup Transformations
Get a related value.
Perform a calculation.
Update slowly changing dimension tables.
Connected or unconnected.
Cached or uncached.
Lookup Components
We have to define the following components when we
configure a Lookup transformation in a mapping.
Lookup source
Ports
Properties
Condition
Edit Transformation
Metadata extensions
Lookup Transformation
Creating a Lookup Transformation
In the Mapping Designer, click Transformation > Create. Select the Lookup transformation. Enter a name for the transformation and click OK. The naming convention for Lookup transformations is LKP_TransformationName.
In the Select Lookup Table dialog box, we can choose the following options:
Choose an existing table or file definition.
Choose to import a definition from a relational table or file.
Skip to create a manual definition.
If we want to manually define the Lookup transformation, click the Skip button.
Define input ports for each lookup condition we want to define.
Lookup Transformation
For Lookup transformations that use a dynamic lookup
cache, associate an input port or sequence ID with each
lookup port.
Edit Transformation
Lookup Transformation
Set the properties on the Ports tab and the Properties tab.
Properties Tab
Port Tab
Lookup Transformation Tips
Add an index to the columns used in a
lookup condition
Place conditions with an equality
operator (=) first.
Cache small lookup tables.
Call unconnected Lookup transformations with the :LKP reference qualifier.
Lookup Caches
The Integration Service builds a cache in memory when it processes the first row of data in
a cached Lookup transformation.
It allocates memory for the cache based on the amount we configure in the transformation
or session properties.
The Integration Service stores condition values in the index cache and output values in the
data cache.
The Integration Service queries the cache for each row that enters the transformation.
The Integration Service also creates cache files by default in the $PMCacheDir.
Types of lookup caches
Persistent cache
Recache from database
Static cache
Dynamic cache
Shared cache
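The index-cache/data-cache split described above can be sketched in Python; the class, rows, and column names are invented for illustration:

```python
# Lookup source rows: (customer_id, name, city) — invented data.
lookup_source_rows = [
    (101, "Acme", "Pune"),
    (102, "Globex", "Delhi"),
]

class LookupCache:
    def __init__(self):
        self.index = {}   # index cache: condition value -> position in data cache
        self.data = []    # data cache: output values
        self.built = False

    def build(self, rows):
        """Built when the first row reaches the Lookup transformation."""
        for cid, name, city in rows:
            self.index[cid] = len(self.data)
            self.data.append((name, city))
        self.built = True

    def query(self, cid):
        if not self.built:  # lazy build, like the Integration Service
            self.build(lookup_source_rows)
        pos = self.index.get(cid)
        return self.data[pos] if pos is not None else None

cache = LookupCache()
print(cache.query(101))  # ('Acme', 'Pune')
print(cache.query(999))  # None — no matching row
```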
Rank Transformation
The Rank transformation is an active transformation.
The Rank transformation allows us to select only the top or bottom rank of data.
The Rank transformation differs from the transformation functions MAX and MIN in that it can select a group of top or bottom values, not just one value.
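Top-N selection per group can be sketched in Python; the field names, grouping column, and N are invented for illustration:

```python
import heapq

# Rows with a group column and a rank column (invented data).
rows = [
    {"dept": "A", "sales": 10}, {"dept": "A", "sales": 50},
    {"dept": "A", "sales": 30}, {"dept": "B", "sales": 70},
    {"dept": "B", "sales": 20},
]

def top_n_per_group(rows, group_key, rank_key, n):
    """Group by `group_key`, then keep the top `n` rows by `rank_key` —
    what a Rank transformation does with a Group By port and Number of Ranks = n."""
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row)
    result = []
    for members in groups.values():
        result.extend(heapq.nlargest(n, members, key=lambda r: r[rank_key]))
    return result

top = top_n_per_group(rows, "dept", "sales", 2)
print(sorted(r["sales"] for r in top))  # [20, 30, 50, 70]
```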
Creating Rank Transformation
In the Mapping Designer, click Transformation >
Create. Select the Rank transformation. Enter a
name for the Rank. The naming convention for
Rank transformations is
RNK_TransformationName.
Enter a description for the transformation. This
description appears in the Repository Manager.
Click Create, and then click Done.
The Designer creates the Rank transformation.
Link columns from an input transformation to the
Rank transformation.
Click the Ports tab, and then select the Rank (R)
option for the port used to measure ranks.
If we want to create groups for ranked rows,
select Group By for the port that defines the
group.
Port Tab
Rank Transformation
Click the Properties tab and select whether we want the top or bottom rank.
For the Number of Ranks option, enter the number of rows we want to select for the rank.
Change the other Rank transformation
properties, if necessary.
Click OK.
Click Repository > Save.
Properties Tab
Sequence Generator Transformation
The Sequence Generator transformation has two output ports, NEXTVAL and CURRVAL. Connect NEXTVAL to the primary key of a target such as T_ORDERS_PRIMARY and to the foreign key of a target such as T_ORDERS_FOREIGN to generate matching key values.
Output example: with Current Value = 1 and Increment By = 1, if only the CURRVAL port is connected, the Integration Service generates the same constant value for CURRVAL (here, 1) for every row.
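The sequence behavior can be sketched in Python; the class is invented, and the start value and increment mirror the example above:

```python
# Sketch of the Sequence Generator's NEXTVAL port.
class SequenceGenerator:
    def __init__(self, current_value=1, increment_by=1):
        self.value = current_value
        self.increment_by = increment_by

    def nextval(self):
        """NEXTVAL: return the current value, then advance by Increment By."""
        v = self.value
        self.value += self.increment_by
        return v

gen = SequenceGenerator(current_value=1, increment_by=1)
print([gen.nextval() for _ in range(4)])  # [1, 2, 3, 4]

# When only CURRVAL is connected, the sequence never advances,
# so every row receives the same constant value:
gen2 = SequenceGenerator(current_value=1, increment_by=1)
print([gen2.value for _ in range(4)])     # [1, 1, 1, 1]
```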
Update Strategy
An Update Strategy transformation is an active transformation.
When we design a data warehouse, we need to decide what type of information to store in
targets. As part of the target table design, we need to determine whether to maintain all the
historic data or just the most recent changes.
The model we choose determines how we handle changes to existing rows. In
PowerCenter, we set the update strategy at two different levels.
Within a session
Within a mapping
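Within a mapping, the update strategy flags each row with an operation; this can be sketched in Python. The numeric constants mirror PowerCenter's DD_INSERT/DD_UPDATE/DD_DELETE/DD_REJECT flags, but the flagging rule itself is invented:

```python
# PowerCenter's row-operation constants.
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def update_strategy(row):
    """Hypothetical strategy expression:
    IIF( ISNULL( existing_key ), DD_INSERT, DD_UPDATE ) —
    new rows are flagged for insert, known rows for update."""
    return DD_INSERT if row.get("existing_key") is None else DD_UPDATE

rows = [{"existing_key": None}, {"existing_key": 42}]
print([update_strategy(r) for r in rows])  # [0, 1] — insert, then update
```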
Update Strategy
Creating an Update Transformation
In the Mapping Designer, click Transformation > Create. Select the Update Strategy transformation. The naming convention for Update Strategy transformations is UPD_TransformationName.
Enter a name for the Update Strategy transformation, and click Create. Click Done.
The Designer creates the Update Strategy transformation.
Drag all ports from another transformation representing data we
want to pass through the Update Strategy transformation.
In the Update Strategy transformation, the Designer creates a copy
of each port we drag. The Designer also connects the new port to
the original port. Each port in the Update Strategy transformation is a
combination of input/output port.
Normally, we would select all of the columns destined for a particular
target. After they pass through the Update Strategy transformation,
this information is flagged for update, insert, delete, or reject.
Double-click the title bar of the transformation to open the Edit
Transformations dialog box.
Click the Properties tab.
Click the button in the Update Strategy Expression field.
The Expression Editor appears.
Update Strategy
Enter an update strategy expression to flag
rows as inserts, deletes, updates, or rejects.
Validate the expression and click OK.
Click OK to save the changes.
Connect the ports in the Update Strategy
transformation to another transformation or a
target instance.
Click Repository > Save
Setting the Update Strategy for a Session
When we configure a session, we have
several options for handling specific database
operations, including updates.
Specifying an Operation for All Rows
When we configure a session, we can select
a single database operation for all rows using
the Treat Source Rows As setting.
Configure the Treat Source Rows As session
property.
The Treat Source Rows As setting provides the following options:
Insert
Delete
Update
Data Driven
Update Strategy
Specifying Operations for Individual Target Tables
Once we determine how to treat all rows in the
session, we also need to set update strategy options
for individual targets. Define the update strategy
options in the Transformations view on Mapping tab
of the session properties.
We can set the following update strategy options for
Individual Target Tables.
Insert. Select this option to insert a row into a target
table.
Delete. Select this option to delete a row from a table.
Update. We have the following options in this situation:
Update as Update. Update each row flagged for update if it exists in the target table.
Update as Insert. Insert each row flagged for update.
Update else Insert. Update the row if it exists. Otherwise, insert it.
Truncate table. Select this option to truncate the
target table before loading data.
Router Transformation
A Router transformation is an Active Transformation.
A Router transformation is similar to a Filter transformation because both transformations
allow us to use a condition to test data.
A Filter transformation tests data for one condition and drops the rows of data that do not
meet the condition. However, a Router transformation tests data for one or more conditions
and gives us the option to route rows of data that do not meet any of the conditions to a
default output group.
If we need to test the same input data based on multiple conditions, use a Router
transformation in a mapping instead of creating multiple Filter transformations to perform
the same task.
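The multi-condition routing described above can be sketched in Python; the group names, conditions, and data are invented for illustration:

```python
# A Router: one input, several user-defined output groups with their own
# conditions, plus a default group for rows that match none of them.
rows = [{"country": "IN"}, {"country": "US"}, {"country": "FR"}]

group_conditions = [
    ("india_group", lambda r: r["country"] == "IN"),
    ("us_group", lambda r: r["country"] == "US"),
]

def route(rows, group_conditions):
    groups = {name: [] for name, _ in group_conditions}
    groups["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in group_conditions:
            if condition(row):
                groups[name].append(row)   # a row can satisfy several groups
                matched = True
        if not matched:
            groups["DEFAULT"].append(row)  # unlike a Filter, nothing is silently dropped
    return groups

routed = route(rows, group_conditions)
print([r["country"] for r in routed["DEFAULT"]])  # ['FR']
```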
Creating a Router Transformation
In the Mapping Designer, click Transformation >
Create. Select the Router transformation. Enter a
name for the transformation and Click OK.
The naming convention for router transformation is
RTR_TransformationName.
Input values in the Router transformation
Select and drag all the desired ports from a transformation to add them to the Router transformation.
Double-click the title bar of the Router transformation to edit transformation properties.
Router Transformation
Set the properties on the Ports tab and the Properties tab.
Port Tab
Properties Tab
Router Transformation
A Router transformation has the following types of groups:
Input
Output
There are two types of output groups:
User-defined groups
Default group
Router Transformation
Connecting Router Transformations in a Mapping
When we connect transformations to a Router transformation in a mapping, consider the following rules:
We can connect one group to one transformation or target.
We can connect one output port in a group to multiple transformations or targets.
We can connect multiple output ports in one group to multiple transformations or targets.
Reusable Transformation
Reusable Transformation
Creating a Reusable Transformation
Go to the Transformation Developer and click Transformation > Create.
To promote an existing transformation to reusable, open the Mapping Designer, double-click the transformation, select the Transformation tab, and check Make Reusable.
Changes That Can Invalidate a Mapping
When we delete a port or multiple ports in a transformation.
When we change a port datatype, we make it impossible to map data from that port to another port using an incompatible datatype.
When we change a port name, expressions that refer to the port are no longer valid.
When we enter an invalid expression in the reusable transformation, mappings that use the transformation are no longer valid. The Integration Service cannot run sessions based on invalid mappings.
Java Transformation
The Java transformation is an active or passive connected transformation that provides a simple native programming interface to define transformation functionality with the Java programming language.
We create Java transformations by writing Java code snippets that define transformation logic.
The PowerCenter Client uses the Java Development Kit (JDK) to compile the Java code and generate byte code for the transformation. The Integration Service uses the Java Runtime Environment (JRE) to execute the generated byte code at run time.
Steps To Define Java Transformation
Create the transformation in the Transformation Developer or Mapping Designer.
Configure input and output ports and groups for the transformation. Use port names as
variables in Java code snippets.
Configure the transformation properties.
Use the code entry tabs in the transformation to write and compile the Java code for the
transformation.
Locate and fix compilation errors in the Java code for the transformation.
Java Transformation
Enter the ports and use the port names as identifiers in the Java code.
Go to the Java code entry tab, enter the Java code, click Compile, and check the output in the Output window.
Create a session and workflow and run the session.
Functions
Some functions used in the Designer are:
AVG
Syntax: AVG( numeric_value [, filter_condition ] )
MAX
Syntax: MAX( value [, filter_condition ] )
MIN
Syntax: MIN( value [, filter_condition ] )
INSTR
Syntax: INSTR( string, search_value [, start [, occurrence ] ] )
SUBSTR
Syntax: SUBSTR( string, start [, length ] )
IS_DATE
Syntax: IS_DATE( value )
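Two of the string functions above can be approximated in Python to show the 1-based indexing PowerCenter uses; these are simplified sketches (the optional occurrence argument and negative positions are omitted):

```python
def instr(string, search_value, start=1):
    """Like INSTR: 1-based position of search_value, or 0 when not found."""
    pos = string.find(search_value, start - 1)
    return pos + 1  # str.find returns -1 on a miss, so a miss becomes 0

def substr(string, start, length=None):
    """Like SUBSTR: take `length` characters from 1-based position `start`."""
    begin = start - 1
    return string[begin:] if length is None else string[begin:begin + length]

print(instr("PowerCenter", "Center"))  # 6
print(substr("PowerCenter", 6, 3))     # 'Cen'
```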
Properties tab
We can edit the default numeric and datetime format properties in the Source Analyzer and the Target Designer.
Metadata Extensions tab
We can extend the metadata stored in the
repository by associating information with
repository objects, such as flat file
definitions.
Click OK.
The Designer assigns the data type of the data the expression returns. The data types have the precision and scale of transformation data types.
Click OK.
The expression displays in the User-Defined Function Browser dialog box.
Mapplet Designer
A mapplet is a reusable object that we create in the Mapplet Designer. It contains a set of
transformations and we reuse that transformation logic in multiple mappings.
When we use a mapplet in a mapping, we use an instance of the mapplet. Like a reusable
transformation, any change made to the mapplet is inherited by all instances of the mapplet.
Usage of Mapplets
Include source definitions
Use multiple source definitions and source qualifiers to provide source data for a mapping.
Accept data from sources in a mapping
If we want the mapplet to receive data from the mapping, we use an Input transformation to
receive source data.
Include multiple transformations
A mapplet can contain as many transformations as you need.
Pass data to multiple transformations
We can create a mapplet to feed data to multiple transformations.
Contain unused ports
We do not have to connect all mapplet input and output ports in a mapping
Mapplet Designer
Limitations of Mapplets
We cannot connect a single port in the Input transformation to multiple transformations in
the mapplet.
Data Profiling
Data profiling is a technique used to analyze source data. PowerCenter Data Profiling can help us evaluate source data and detect patterns and exceptions. We can profile source data to suggest candidate keys, detect data patterns, and evaluate join criteria.
Use Data Profiling to analyze source data in the following situations:
During mapping development.
During production, to maintain data quality.
To profile source data, we create a data profile. We can create a data profile based on a source or mapplet in the repository. Data profiles contain functions that perform calculations on the source data.
The repository stores the data profile as an object. We can apply profile functions to a column within a source, to a single source, or to multiple sources.
We can create the following types of data profiles.
Auto profile
Contains a predefined set of functions for profiling source data. Use an auto profile during
mapping development.
Custom profile
Use a custom profile during mapping development to validate documented business rules
about the source data. we can also use a custom profile to monitor data quality or validate
the results of BI reports.
Data Profiling
Optionally, click Save As Default to create new default functions based on the functions
selected here.
Optionally, click Profile Settings to enter settings for domain inference and structure inference
tuning.
Optionally, modify the default profile settings and click OK.
Click Configure Session to configure the session properties after you create the data profile.
Click Next if you selected Configure Session, or click Finish if you disabled Configure Session.
The Designer generates a data profile and profile mapping based on the profile functions.
Configure the Profile Run options and click Next.
Configure the Session Setup options.
Click Finish.
Profile Manager
Profile Manager is a tool that helps to
manage data profiles. It is used to set default
data profile options, work with data profiles in
the repository, run profile sessions, view
profile results, and view sources and
mapplets with at least one profile defined for
them. When we launch the Profile Manager,
we can access profile information for the
open folders in the repository.
There are two views in the Profile Manager:
Profile View
Source View
Debugger Overview
We can debug a valid mapping to gain troubleshooting information about data and error
conditions.
The Debugger is used in the following situations:
Before we run a session
After we save a mapping, we can run some initial tests with a debug session
before we create and configure a session in the Workflow Manager.
After we run a session
If a session fails or if we receive unexpected results in the target, we can run the
Debugger against the session. we might also run the Debugger against a session if
we want to debug the mapping using the configured session properties.
Create breakpoints. Create breakpoints in a mapping where we want the Integration
Service to evaluate data and error conditions.
Configure the Debugger. Use the Debugger Wizard to configure the Debugger for the mapping. Select the session type the Integration Service uses when it runs the Debugger.
Run the Debugger. Run the Debugger from within the Mapping Designer. When we run the Debugger, the Designer connects to the Integration Service. The Integration Service initializes the Debugger and runs the debugging session and workflow.
Monitor the Debugger. While we run the Debugger, we can monitor the target data,
transformation and mapplet output data, the debug log, and the session log.
Modify data and breakpoints. When the Debugger pauses, we can modify data and see
the effect on transformations, mapplets, and targets as the data moves through the
pipeline. We can also modify breakpoint information.
Debugger Overview
Create Breakpoints
Go to Mappings > Debugger > Edit Breakpoints.
Choose the instance name and the breakpoint type.
Click Add to add the breakpoints.
Give the condition for the data breakpoint type.
Give the number of errors after which we want to stop.
Run the Debugger
Go to Mappings > Debugger > Start Debugger.
Click Next, and then choose Create a debug session; otherwise, choose an existing session.
Click Next.
Debugger Overview
Choose the connections for the source and target, and click Next.
Click Next.
Debug Indicators
The End