This software and documentation contain proprietary information of Informatica Corporation, and are provided under a license agreement containing
restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without the prior written consent of Informatica
Corporation.
Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as
provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR
52.227-14 (ALT III), as applicable.
The information in this product or documentation is subject to change without notice. If you find any problems in the software or documentation, please report
them to us in writing. Informatica Corporation does not warrant that this product or documentation is error free.
Informatica, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica
Data Quality and Informatica Data Explorer are trademarks or registered trademarks of Informatica Corporation in the United States and in jurisdictions
throughout the world. All other company and product names may be trade names or trademarks of their respective owners. U.S. Patent Pending.
DISCLAIMER: Informatica Corporation provides this documentation “as is” without warranty of any kind, either express or implied, including, but not limited
to, the implied warranties of non-infringement, merchantability, or use for a particular purpose. The information provided in this documentation may include
technical inaccuracies or typographical errors. Informatica could make improvements and/or changes in the products described in this documentation at any
time without notice.
Table of Contents
List of Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xvii
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxi
About This Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxii
Document Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxii
Other Informatica Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xxiii
Visiting Informatica Customer Portal . . . . . . . . . . . . . . . . . . . . . . . . .xxiii
Visiting the Informatica Web Site . . . . . . . . . . . . . . . . . . . . . . . . . . . .xxiii
Visiting the Informatica Knowledge Base . . . . . . . . . . . . . . . . . . . . . . .xxiii
Obtaining Technical Support . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xxiii
iv Table of Contents
Changes to Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Configuration Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Upgrading BW Option . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Creating Profiles for Production and Development Users . . . . . . . . . . . . . . . 48
Renaming the saprfc.ini File on Windows . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Defining PowerCenter as a Logical System in SAP BW . . . . . . . . . . . . . . . . . 52
Configuring the saprfc.ini File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
saprfc.ini Entry Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Sample saprfc.ini File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Configuring an Entry in the saprfc.ini File . . . . . . . . . . . . . . . . . . . . . . 54
Creating the SAP BW Service . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Load Balancing for the BW System and the SAP BW Service . . . . . . . . . 56
Steps to Create the SAP BW Service . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Importing the ABAP Program into SAP BW . . . . . . . . . . . . . . . . . . . . . . . . 59
Troubleshooting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Troubleshooting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Chapter 7: Application Source Qualifier for SAP R/3 Sources . . . . 119
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Generating an ABAP Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Available ABAP Generation Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Generating Open SQL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Generating Exec SQL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Generating ABAP Join Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
Working with ABAP Program Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Validating the ABAP Program Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
Joining Source Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Joining Sources Using Open SQL . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Joining Sources Using Exec SQL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Joining Sources Using ABAP Join Syntax . . . . . . . . . . . . . . . . . . . . . . . 129
Selecting a Join Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
Using Multiple Outer Joins. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Joining Tables and Hierarchies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Joining Tables and IDocs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Specifying Join Conditions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
Creating an ABAP Code Block . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
Rules for Inserting the ABAP Code Block . . . . . . . . . . . . . . . . . . . . . . 136
Creating ABAP Program Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Naming Convention . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Creating Structure and Structure Field Variables . . . . . . . . . . . . . . . . . 137
Creating ABAP Type Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Viewing ABAP Program Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
Using SAP System Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
Entering a Source Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
Using Dynamic Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
Using Static Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Using Mapping Variables and Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Using Mapping Variables in the ABAP Program Flow . . . . . . . . . . . . . . 145
Working with SAP Date Formats . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
Working with IDoc Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Working with IDoc Sources in the ABAP Program Flow . . . . . . . . . . . . 147
Entering an IDoc Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
Validating the IDoc Filter Condition . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Creating and Configuring an Application Source Qualifier . . . . . . . . . . . . . 150
Chapter 12: Working with RFC/BAPI Function Mappings . . . . . . . . 227
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
Extracting Data from SAP Using RFC/BAPI Functions . . . . . . . . . . . . . 228
Changing Data in SAP Using BAPI Functions . . . . . . . . . . . . . . . . . . . 229
Writing Data into SAP Using BAPI Functions . . . . . . . . . . . . . . . . . . . 229
Mapplets in RFC/BAPI Function Mappings . . . . . . . . . . . . . . . . . . . . . 230
Working with RFC/BAPI No Input Mappings . . . . . . . . . . . . . . . . . . . . . . 232
Source Definition in an RFC/BAPI No Input Mapping. . . . . . . . . . . . . 233
Source Qualifier in an RFC/BAPI No Input Mapping . . . . . . . . . . . . . . 233
FunctionCall_No_Input Mapplet in an RFC/BAPI No Input Mapping . . . . . . . . 233
Target Definition in a No Input Mapping . . . . . . . . . . . . . . . . . . . . . . 235
Working with RFC/BAPI Multiple Stream Mappings . . . . . . . . . . . . . . . . . 236
Source Definition in an RFC/BAPI Multiple Stream Mapping . . . . . . . . 238
Source Qualifier for an RFC/BAPI Multiple Stream Mapping . . . . . . . . 239
Prepare Mapplet in an RFC/BAPI Multiple Stream Mapping . . . . . . . . . 240
RFCPreparedData Target Definition in an RFC/BAPI Multiple Stream Mapping . . . 241
RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping . . . 241
Source Qualifier for RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping . . . 242
Transaction_Support Mapplet in an RFC/BAPI Multiple Stream Mapping . . . 243
FunctionCall_MS Mapplet in an RFC/BAPI Multiple Stream Mapping . . . 243
Target Definition in an RFC/BAPI Multiple Stream Mapping . . . . . . . . 244
Working with RFC/BAPI Single Stream Mappings . . . . . . . . . . . . . . . . . . . 246
Source Definition in an RFC/BAPI Single Stream Mapping . . . . . . . . . 247
Source Qualifier in an RFC/BAPI Single Stream Mapping . . . . . . . . . . 250
Sorter Transformation in an RFC/BAPI Single Stream Mapping . . . . . . 251
FunctionCall_SS Mapplet in an RFC/BAPI Single Stream Mapping . . . 251
Target Definition in an RFC/BAPI Single Stream Mapping . . . . . . . . . . 252
Working with Custom Return Structures . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Generating SAP RFC/BAPI Mappings . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
Working with Function Input Data for RFC/BAPI Functions . . . . . . . . . . . 261
Chapter 13: Configuring RFC/BAPI Function Sessions . . . . . . . . . . 263
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
Copying RFC/BAPI Function Mappings . . . . . . . . . . . . . . . . . . . . . . . 264
Configuring a Session for SAP RFC/BAPI Function Mappings . . . . . . . . . . 265
Error Handling for RFC/BAPI Sessions . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
Identifying and Verifying the Basic IDoc Type in the Listener Mapping . 297
Step 3. Configure and Start the Listener Workflow . . . . . . . . . . . . . . . . . . . 299
Step 4. Create the Processing Mappings . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Update Modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Request Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302
Non-Hierarchy and Hierarchy DataSource Processing Mappings . . . . . . 302
Steps to Create a Processing Mapping . . . . . . . . . . . . . . . . . . . . . . . . . 304
Generating and Executing SQL for the Relational Targets . . . . . . . . . . . 311
Step 5. Deploy the Request Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
Step 6. Create the Send Request Workflows . . . . . . . . . . . . . . . . . . . . . . . . 313
Step 7. Create the Processing Workflows . . . . . . . . . . . . . . . . . . . . . . . . . . 314
Creating a Processing Session . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
Creating a Cleanup Session . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
Configuring the Processing Workflow . . . . . . . . . . . . . . . . . . . . . . . . . 315
Step 8. Schedule the Processing and Send Request Workflows . . . . . . . . . . . 316
Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
Steps to Schedule the Processing and Send Request Workflows . . . . . . . 318
Troubleshooting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 415
List of Figures
Figure 1-1. PowerCenter and SAP NetWeaver Integration . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Figure 1-2. Extracting Data from BW . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Figure 1-3. Loading Data into BW . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Figure 1-4. BW Data Loading Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Figure 4-1. Import Table Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Figure 4-2. Import Tables with All Keys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Figure 4-3. Import Tables Without All Keys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Figure 4-4. Sample Uniform Hierarchy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
Figure 4-5. Sample Non-Uniform Hierarchy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Figure 4-6. Import Hierarchy Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Figure 4-7. Sample Hierarchy Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Figure 4-8. Imported Hierarchy Definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Figure 4-9. Detail Table Name for Hierarchy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Figure 4-10. Import IDoc Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
Figure 4-11. IDoc Control Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Figure 4-12. Definitions Organized in Navigator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Figure 5-1. Hierarchy Properties Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Figure 5-2. IDoc Properties Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Figure 5-3. Generate and Install ABAP Programs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Figure 5-4. Installed Programs Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Figure 6-1. SAP Functions Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
Figure 6-2. SAP Function Parameters in the ABAP Program Flow Dialog Box . . . . . . . . . . . . 114
Figure 7-1. Select ABAP Generation Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
Figure 7-2. ABAP Program Flow Dialog Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Figure 7-3. Using SAP System Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
Figure 8-1. Session Properties for a Stream Mode Session . . . . . . . . . . . . . . . . . . . . . . . . . . 156
Figure 8-2. Session Properties for a File Mode Session . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
Figure 9-1. IDoc Display Tab of the SAP/ALE IDoc Interpreter Transformation . . . . . . . . . 179
Figure 9-2. Segments and Groups in the IDoc Display Tab . . . . . . . . . . . . . . . . . . . . . . . . . 181
Figure 9-3. Segment Status Column Selections when Show Group Status is Cleared . . . . . . . 183
Figure 9-4. SAP/ALE IDoc Transformation Buttons on the Transformations Toolbar . . . . . . 184
Figure 9-5. Outbound IDoc Mapping with Invalid IDoc Error Handling . . . . . . . . . . . . . . . 192
Figure 10-1. Inbound IDoc Mapping with Invalid IDoc Error Handling . . . . . . . . . . . . . . . 203
Figure 11-1. Outbound IDoc Session Conditions in the Session Properties . . . . . . . . . . . . . . 208
Figure 11-2. Configuring a Session for Invalid IDoc Error Handling . . . . . . . . . . . . . . . . . . 214
Figure 12-1. Viewing RFC/BAPI Function Metadata from the Mapplet Input Output Ports . . . 231
Figure 12-2. RFC/BAPI Mapping for a No-Input BAPI Function Signature . . . . . . . . . . . . . 233
Figure 12-3. Mapplet in a No Input RFC/BAPI Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . 234
Figure 12-4. FunctionCall_No_Input Mapplet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
Figure 12-5. RFC/BAPI Multiple Stream Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
Figure 12-6. Prepare Mapplet in an RFC/BAPI Multiple Stream Mapping . . . 240
Figure 12-7. RFCPreparedData Target Definition in an RFC/BAPI Multiple Stream Mapping . . . 241
Figure 12-8. RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping . . . 242
Figure 12-9. Source Qualifier for RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping . . . 242
Figure 12-10. Transaction_Support Mapplet in the Function Call Pipeline in a Multiple Stream Mapping . . . 243
Figure 12-11. FunctionCall_MS Mapplet for BAPI_EMPLOYEE_GETLIST . . . 244
Figure 12-12. RFC/BAPI Single Stream Mapping for BAPI_PLANNEDORDER_CREATE . . . 247
Figure 12-13. Sample Comma Delimited Flat File Source . . . 250
Figure 12-14. Flat File Source Definition for BAPI_PLANNEDORDER_CREATE Single Stream Mapping . . . 250
Figure 12-15. Sorter Transformation in an RFC/BAPI Single Stream Mapping . . . 251
Figure 12-16. FunctionCall_SS Mapplet in an RFC/BAPI Single Stream Mapping . . . 252
Figure 12-17. Return Structure Tab on the Generate SAP RFC/BAPI Mapping Wizard . . . . . .254
Figure 14-1. SAP DMI Prepare Transformation Button on the Transformations Toolbar . . . .278
Figure 15-1. Business Content Integration Workflow Interaction . . . . . . . . . . . . . . . . . . . . . .292
Figure 15-2. Listener Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .299
Figure 15-3. Processing Mapping for a Non-Hierarchy DataSource . . . . . . . . . . . . . . . . . . . .303
Figure 15-4. Processing Mapping for a Hierarchy DataSource . . . . . . . . . . . . . . . . . . . . . . . .304
Figure 15-5. Configuring a Delta Update for a Non-Hierarchy DataSource . . . . . . . . . . . . . .308
Figure 15-6. Configuring a Full Update for a Hierarchy DataSource . . . . . . . . . . . . . . . . . . .309
Figure 15-7. Processing Workflow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .314
Figure 15-8. Scheduling the Processing and Send Request Workflows . . . . . . . . . . . . . . . . . .317
Figure 17-1. Sample SAP BW Hierarchy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .339
Figure 18-1. Sample Dynamic Filter Condition in the ABAP Program Flow Dialog Box . . . . .356
Figure 18-2. $$BWFILTERVAR Mapping Parameter for SAP BW Data Selection Entries . . . .357
Figure 19-1. Configuring a Process Chain that Loads to the PSA Only . . . . . . . . . . . . . . . . . .366
Figure 19-2. Configuring a Process Chain that Loads to the PSA and a Data Target . . . . . . . .366
Figure C-1. Unicode Data Processing with Different Code Pages . . . . . . . . . . . . . . . . . . . . . .405
Preface
About This Book
The PowerCenter Connect for SAP NetWeaver User Guide provides information to build
mappings and run sessions to extract data from SAP NetWeaver into a data warehouse and
write data into SAP NetWeaver. It is written for the data warehouse developers and software
engineers who are responsible for extracting data from SAP NetWeaver into a data warehouse
or loading data into SAP NetWeaver.
This guide assumes you have knowledge of relational database concepts and the database
engines, PowerCenter, and SAP NetWeaver. You should also be familiar with the interface
requirements for other supporting applications. For additional information on related SAP
issues, see the SAP documentation.
The material in this book is also available online.
Document Conventions
This guide uses the following formatting conventions:
italicized monospaced text: This is the variable name for a value you enter as part of an operating system command. This is generic text that should be replaced with user-supplied values.
bold monospaced text: This is an operating system command you enter from a prompt to run a task.
Warning: The following paragraph notes situations where you can overwrite or corrupt data, unless you follow the specified procedure.
Other Informatica Resources
In addition to the product manuals, Informatica provides these other resources:
♦ Informatica Customer Portal
♦ Informatica web site
♦ Informatica Knowledge Base
♦ Informatica Technical Support
WebSupport requires a user name and password. You can request a user name and password at
http://my.informatica.com.
Part I: Getting Started with
PowerCenter Connect for
SAP NetWeaver
This part contains the following chapters:
♦ Understanding PowerCenter Connect for SAP NetWeaver, 3
♦ Configuring mySAP Option, 15
♦ Configuring BW Option, 45
Chapter 1
Understanding
PowerCenter Connect for
SAP NetWeaver
This chapter includes the following topics:
♦ Overview, 4
♦ PowerCenter and mySAP Integration Methods, 5
♦ PowerCenter and BW Integration Methods, 8
♦ Communication Interfaces, 12
♦ Transport System, 13
Overview
SAP NetWeaver is an application platform that integrates multiple business applications and
solutions, such as Customer Relationship Management (CRM), Advanced Planner and
Optimizer (APO), and Bank Analyzer. Developers can add business logic within SAP
NetWeaver using Java 2 Enterprise Edition (J2EE) or Advanced Business Application
Programming-Fourth Generation (ABAP/4 or ABAP), a language proprietary to SAP.
PowerCenter Connect for SAP NetWeaver offers several integration methods to extract data
from or load data into SAP NetWeaver. The integration methods available depend on the
option you use:
♦ mySAP Option. You can use the ABAP, Application Link Enabling (ALE), RFC/BAPI
functions, data migration, or business content integration methods. For more information
about each integration method, see “PowerCenter and mySAP Integration Methods” on
page 5.
♦ BW Option. You can extract data from or load data to the BW Enterprise Data
Warehouse. For more information about the BW extracting and loading integration
methods, see “PowerCenter and BW Integration Methods” on page 8.
SAP NetWeaver is the basis for SAP solutions. Because PowerCenter works with the SAP
NetWeaver application platform, you can integrate PowerCenter with any SAP industry
solution or mySAP application that provides RFC/BAPI or ALE integration methods.
Figure 1-1 shows how PowerCenter integrates with SAP NetWeaver:
[Figure 1-1: PowerCenter connects through the ABAP, RFC/BAPI, DMI, and BCI integration methods to the SAP NetWeaver application platform (J2EE and ABAP stacks over the database server and operating system) and to the BW Enterprise Data Warehouse.]
BW Data Extraction
You can extract data from BW to use as a source in a PowerCenter session with the Open Hub
Service (OHS), an SAP framework for extracting data. OHS uses data from multiple BW data
sources, including InfoSources and InfoCubes. The OHS framework includes InfoSpoke
programs, which extract data from BW and write the output to SAP transparent tables.
The PowerCenter SAP BW Service listens for RFC requests from BW and initiates workflows
within PowerCenter that extract data from BW. You configure the SAP BW Service in the
PowerCenter Administration Console. For more information, see “Creating and Configuring
the SAP BW Service” in the PowerCenter Administrator Guide.
Figure 1-2 shows how PowerCenter interacts with BW to extract data:
[Figure 1-2: On the BW server, an InfoSpoke uses the Open Hub Service to write data from sources such as InfoObjects and InfoCubes to an SAP transparent table or file; a process chain notifies the PowerCenter SAP BW Service, which starts the workflow that extracts the data.]
BW Data Loading
You can use the Business Application Programming Interface (BAPI), an SAP method for linking
components into the Business Framework, to exchange metadata with BW. You can use BAPIs
to extend the SAP Business Information Warehouse to load data into BW from external systems.
You import BW target definitions into the Designer and use the target in a mapping to load
data into BW. During the PowerCenter workflow, the Integration Service uses a transfer
method to load data into BW transfer structures.
You configure transfer methods in BW to specify how to load data into a transfer structure. In
BW, data can move from a transfer structure to InfoCubes and InfoSources through
communication structures. Use transfer rules to move data to a communication structure. Use
update rules to move data from communication structures to InfoCubes and InfoSources.
Figure 1-3 shows how PowerCenter interacts with BW to load data:
[Figure 1-3: The PowerCenter Designer, Integration Service, and SAP BW Service communicate with the BW server through the staging BAPIs. Data loads through the Staging Engine into the PSA or an IDoc transfer structure, moves through the communication structure for an InfoSource, and update rules load it into InfoCubes and the operational data store, as directed by an InfoPackage.]
Chapter 2
Configuring mySAP
Option
This chapter includes the following topics:
♦ Overview, 16
♦ Configuration Checklist, 18
♦ Installing and Configuring SAP, 22
♦ Renaming the sideinfo and saprfc.ini Files on Windows, 30
♦ Defining PowerCenter as a Logical System in SAP, 31
♦ Configuring the sideinfo File, 39
♦ Configuring the saprfc.ini File, 41
Overview
PowerCenter Connect for SAP NetWeaver mySAP Option requires configuration on both the
PowerCenter and SAP systems. The administrators for each of these systems must perform
the configuration tasks for their respective systems.
Changes to Installation
PowerCenter Connect for SAP NetWeaver mySAP Option 8.1.1 includes the following
installation changes:
♦ Setting the SIDE_INFO and RFC_INI environment variables. You no longer need to
set the SIDE_INFO and RFC_INI environment variables before installing
PowerCenter.
♦ sideinfo and saprfc.ini file name changes. When you install or upgrade PowerCenter, the
installer places the files sideinfo.infa_811 and saprfc.ini.infa_811 on the PowerCenter
Client and PowerCenter Services machines. If you want to use these files, you need to
rename them. For more information, see “Renaming the sideinfo and saprfc.ini Files on
Windows” on page 30.
Configuration Checklist
After you install and configure SAP and PowerCenter, use one or more of the following
methods to integrate PowerCenter with mySAP applications:
♦ Data integration using ABAP. For more information, see “Integrating with SAP Using
ABAP” on page 19.
♦ IDoc integration using ALE. For more information, see “Integrating with SAP Using
ALE” on page 20.
♦ Data integration using RFC/BAPI functions. For more information, see “Integrating with
SAP Using RFC/BAPI Functions” on page 20.
♦ Data migration. For more information, see “Migrating Data to SAP” on page 20.
♦ Business content integration. For more information, see “Integrating with SAP Business
Content” on page 20.
Configure entries in saprfc.ini. Configure the saprfc.ini file on the PowerCenter Integration Service and Client with parameters that enable communication with SAP.

[Table: configuration tasks, with columns (ABAP, ALE, RFC/BAPI, DMI, Business Content) indicating which integration methods each task applies to.]
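Because several tasks in this checklist involve editing saprfc.ini, a small check script can help catch incomplete entries. This sketch is not part of the product; the required fields per entry type reflect common saprfc.ini conventions (Type A for a specific application server, Type B for load balancing, Type R for a registered server program):

```python
def parse_saprfc_entries(text):
    """Group the DEST=... stanzas of a saprfc.ini-style file into dicts.

    Blank lines and /* comment lines are skipped; a new entry begins
    at each DEST= line.
    """
    entries, current = [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("/*"):
            continue
        key, _, value = line.partition("=")
        key, value = key.strip().upper(), value.strip()
        if key == "DEST":
            current = {"DEST": value}
            entries.append(current)
        elif current is not None:
            current[key] = value
    return entries

def check_entry(entry):
    """Return the fields missing for the common entry types A, B, and R."""
    required = {
        "A": ["ASHOST", "SYSNR"],             # direct application server
        "B": ["MSHOST"],                      # message server load balancing
        "R": ["PROGID", "GWHOST", "GWSERV"],  # registered program (ALE)
    }
    needed = required.get(entry.get("TYPE", ""), [])
    return [f for f in needed if f not in entry]
```

For instance, a Type R entry that omits GWSERV would be reported with a one-element missing-field list.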
Integrating with SAP Using ALE
Complete the following steps to integrate with SAP using ALE:
1. Define PowerCenter as a logical system in SAP. In SAP, define PowerCenter as a logical
system. For more information, see “Creating a Logical System for IDoc ALE Integration”
on page 31.
2. Configure the saprfc.ini file. Configure an entry in saprfc.ini for RFC communication
with SAP. Configure a Type R entry to listen for outbound IDocs. Verify that the
PROGID parameter matches the Program ID you configured for the logical system in
SAP. For more information, see “Configuring the saprfc.ini File” on page 41.
3. Configure SAP_ALE_IDoc_Reader application connections. Configure the connections to
receive outbound IDocs from SAP. For more information, see “Managing Connection Objects”
in the PowerCenter Workflow Administration Guide.
4. Configure an SAP_ALE_IDoc_Writer application connection. Configure the connection to
send inbound IDocs to SAP. For more information, see “Managing Connection Objects” in
the PowerCenter Workflow Administration Guide.
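To make step 2 concrete, a Type R entry of the kind described there might look like the following sketch. The destination name, program ID, and gateway host and service are placeholders; the PROGID value must match the Program ID you configured for the logical system in SAP:

```
/* Type R entry: listen for outbound IDocs from SAP */
DEST=PMALECONNECT
TYPE=R
PROGID=INFA_LOGICAL_SYSTEM
GWHOST=sapgateway.example.com
GWSERV=sapgw00
```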
Installing and Configuring SAP
The SAP system administrator must complete the following steps for PowerCenter
integration on the development, test, and production SAP systems:
1. Delete transport programs from previous versions. For more information, see “Step 1.
Delete Transport Programs” on page 22.
2. Transport objects to the SAP system. For more information, see “Step 2. Transport
Objects” on page 24.
3. Run transported programs that generate unique IDs. For more information, see “Step 3.
Run Transport Programs” on page 26.
4. Create users in the SAP system for PowerCenter users. For more information, see “Step
4. Create Users” on page 27.
5. Create profiles in the SAP system for PowerCenter users. For more information, see
“Step 5. Create Profiles” on page 28.
6. Create a development class for the ABAP programs that PowerCenter installs on the
SAP system. Complete in the development environment only. For more information, see
“Step 6. Create a Development Class” on page 29.
Note: Some SAP systems refer to a development class as a package.
Modifying /INFATRAN/
To delete a transport object, you need to register the namespace /INFATRAN/ and enter the
repair license. You also need to change the status of /INFATRAN/ in the SAP system to
Modifiable.
To modify /INFATRAN/:
Namespace Role: Enter P if the SAP system is a Producer system, or enter C if the SAP system is a Recipient system. P represents a namespace that you create. You can develop this namespace if you have a valid Develop License. C represents a namespace that you import into the SAP system. You cannot develop this namespace, but you can repair it if you have a valid Repair License.
Repair License: Enter 10357544012122787918, the license key to delete or modify a namespace.
4. Click Save.
5. Go to transaction SE03 and double-click Set System Change Option.
The System Change Option screen appears.
6. Change the Global Setting to Modifiable and click Save.
To delete transports:
1. Go to transaction SE10 and verify whether there are any locks on the objects under the
development class that you want to delete.
An object is locked when another user is modifying or transporting it. Check the list of
modifiable requests for all users in transaction SE10 to verify whether any requests are
associated with the PowerCenter object.
2. Release all the modifiable requests associated with the PowerCenter object.
3. Go to transaction SE10 and create a workbench request for deleting all objects.
4. Go to transaction SE80, select the development class that you want to delete, and click
Display.
For example, select the development class ZINFA_DESIGNTIME. When you select a
development class, it displays all of its objects, such as function groups, programs,
transactions, and dictionary objects. Dictionary objects include tables and structures.
1. Go to transaction STMS.
2. Click Overview > Imports.
3. Open the target system queue.
4. Click Extras > Other Requests > Add.
The Add Transport Request to Import Queue dialog box appears.
5. Add a transport request number.
When you add a transport request number, delete the prefix. For example, when you add
ZINFABC_RUN_R900101.R46, delete ZINFABC_RUN. Place the ZINFABC
run-time transport first.
Integration Feature | Production/Development | Authorization Object | Activity
ALE authorization | Production | B_ALE_RECV | Message type of the IDoc that needs to be written to the SAP system
Note: When you run a session with an RFC/BAPI function mapping, you also need to include
authorization objects specific to the RFC/BAPI function you are using.
The installer places the following connection files, which you rename as shown:
♦ sideinfo.infa_811. Rename to sideinfo.
♦ saprfc.ini.infa_811. Rename to saprfc.ini.
1. Go to transaction SALE.
The Display IMG window appears.
2. Expand the tree to locate the Application Link Enabling > Sending and Receiving
Systems > Logical Systems > Define Logical System operation.
3. Click the IMG - Activity icon to run the Define Logical System operation.
An informational dialog box appears.
4. Click Enter.
1. Go to transaction SM59.
The Display and Maintain RFC Destinations window appears.
2. Click Create.
The RFC Destination window appears.
3. For RFC Destination, enter the name of the logical system you created in “Step 1. Create
a Logical System for PowerCenter” on page 31.
For example, LSPowerCenterALE.
4. Enter T for Connection Type to create a TCP/IP connection.
5. Enter a description for the RFC destination.
6. Click Save.
The window refreshes.
7. For Activation Type, click Registration.
8. For Program ID, enter the same name that you entered for RFC Destination.
For example, LSPowerCenterALE. Use this Program ID as the value for the PROGID
parameter in the saprfc.ini file.
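The link between the RFC destination and PowerCenter can be sketched as a Type R entry in the saprfc.ini file. This is a sketch for illustration only: the DEST name, gateway host, and gateway service are placeholder values, and only the PROGID value must match the Program ID you entered for the RFC destination.

```ini
DEST=LSPowerCenterALE
TYPE=R
PROGID=LSPowerCenterALE
GWHOST=sapgateway
GWSERV=sapgw00
```

Check with the SAP administrator for the actual gateway host and gateway service values for your system.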
1. Go to transaction WE21.
2. Click Ports > Transactional RFC.
3. Click Create.
The Ports in IDoc Processing dialog box appears.
4. Click Generate Port Name, or click Own Port Name and enter a name.
5. Click Enter.
6. Enter a description for the port.
7. Select the IDoc record version type.
8. Enter the name of the RFC Destination you created in “Step 2. Create an RFC
Destination” on page 32.
For example, LSPowerCenterALE.
1. Go to transaction WE20.
2. Click Create.
3. Enter the following properties:
♦ Partner number. Enter the name of the logical system you created for PowerCenter. For
example, LSPowerCenterALE.
Enter the following outbound parameters:
♦ Message Type. Select the IDoc message type the SAP system sends to PowerCenter.
♦ Receiver Port. Select the tRFC port number you defined in “Step 3. Create a tRFC Port
for the RFC Destination” on page 33.
♦ IDoc Type. Select the IDoc basic type of the IDocs the SAP system sends to PowerCenter.
3. Click Save.
The Packet Size property appears.
Enter the following inbound parameters:
♦ Message Type. Select the IDoc message type the SAP system receives from PowerCenter.
♦ Process Code. Select the process code. The SAP system uses the process code to call the
appropriate function module to process the IDocs it receives.
9. Click Enter.
10. Repeat steps 7 to 9 to create an inbound parameter for each IDoc message type that SAP
receives from PowerCenter.
For more information, see “Configuring the saprfc.ini File” on page 41.
Note: The logical system created in SAP when you ran the ZINFABCI transaction for
PowerCenter Connect for SAP R/3 version 7.x is still valid. Do not run the ZINFABCI
transaction in version 8.0.
1. Open the sideinfo file in the location specified in Table 2-4 on page 30.
2. Define the following parameters in the sideinfo file:
♦ LU. Host name of the machine where the SAP application server is running.
♦ PROTOCOL. Set to I.
3. If you are connecting to multiple SAP systems, create one entry for each system in the
sideinfo file with unique DEST parameters.
Check with the SAP administrator for the port numbers of the gateway service and the
dispatcher service.
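For example, sideinfo entries for a development system and a production system might look like the following sketch. This sketch assumes the standard CPI-C side-information parameters TP, GWHOST, and GWSERV in addition to the LU and PROTOCOL parameters described above; all host names and service names are placeholder values that you confirm with the SAP administrator.

```ini
DEST=sapdev
LU=sapdev01
TP=sapdp00
GWHOST=sapdev01
GWSERV=sapgw00
PROTOCOL=I

DEST=sapprod
LU=sapprod01
TP=sapdp00
GWHOST=sapprod01
GWSERV=sapgw00
PROTOCOL=I
```

Note that each entry uses a unique DEST value, as required when you connect to multiple SAP systems.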
Table 2-5 lists the saprfc.ini entry types (Type A, Type B, or Type R) required for each
integration method: ABAP, ALE, RFC/BAPI, data migration, and business content.
The saprfc.ini file uses the following parameters. The second column lists the entry types
(A, B, R, or All) that use each parameter:

Parameter | Entry Type | Description
DEST All Logical name of the SAP system for the connection.
For the ABAP integration method, set equal to the DEST entry in sideinfo.
For SAP versions 4.6C and higher, use up to 32 characters. For earlier versions,
use up to eight characters.
All DEST entries must be unique. You must have only one DEST entry for each
SAP system.
TYPE All Type of connection. Set to A, B, or R. For a list of the entry types required for each
integration method, see Table 2-5 on page 41.
ASHOST A Host name or IP address of the SAP application server. PowerCenter uses this entry to
attach to the application server.
PROGID R Program ID. The Program ID must be the same as the Program ID for the logical
system you define in the SAP system to send or receive IDocs or to consume
business content data. For business content integration, set to INFACONTNT. For
more information, see “Defining PowerCenter as a Logical System in SAP” on
page 31.
3. If you are connecting to multiple SAP systems, create an entry for each system in the
saprfc.ini file with unique DEST parameters.
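For example, a saprfc.ini file that connects to one SAP system for ABAP integration (Type A) and registers at the SAP gateway for ALE integration (Type R) might look like the following sketch. The host names, system number, and gateway service are placeholder values; SYSNR, GWHOST, and GWSERV are standard saprfc.ini parameters that do not appear in the parameter table above.

```ini
DEST=sapdev
TYPE=A
ASHOST=sapdev01
SYSNR=00

DEST=sapdevale
TYPE=R
PROGID=LSPowerCenterALE
GWHOST=sapdev01
GWSERV=sapgw00
```

Each entry has a unique DEST value, and the PROGID value matches the Program ID defined for the logical system in SAP.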
Configuring BW Option
Overview
PowerCenter Connect for SAP NetWeaver BW Option requires configuration on both the
PowerCenter and SAP BW systems. The administrators for each of these systems must
perform the configuration tasks for their respective systems.
Changes to Installation
PowerCenter Connect for SAP NetWeaver BW Option includes the following installation
changes:
♦ Setting the RFC_INI environment variable. You no longer need to register the
SIDE_INFO and RFC_INI environment variables before installing PowerCenter.
♦ saprfc.ini file name change. When you install PowerCenter, the installer places the
saprfc.ini.infa_811 file on the PowerCenter Client and PowerCenter Services machines. If
you want to use this file, you need to rename it. For more information, see “Renaming the
saprfc.ini File on Windows” on page 51.
Configuration Checklist
Complete the following steps to configure BW Option:
1. Create profiles for development and production users. For more information, see
“Creating Profiles for Production and Development Users” on page 48.
2. Rename the saprfc.ini file on Windows. If you are installing PowerCenter for the first
time on Windows, you need to rename the connection file to saprfc.ini. If you are
upgrading from a previous version, and you want to overwrite the existing saprfc.ini file,
you need to rename the connection file. For more information, see “Renaming the
saprfc.ini File on Windows” on page 51.
Upgrading BW Option
Complete the following steps if you are upgrading from a previous version of PowerCenter
Connect for SAP NetWeaver BW Option:
1. Make a copy of the saprfc.ini file.
2. Uninstall the previous version of PowerCenter Connect for SAP NetWeaver BW Option
according to the following guidelines:
♦ If you are upgrading from PowerCenter Connect for SAP BW 7.x or earlier, uninstall
PowerCenter Connect for SAP BW.
♦ If you are upgrading from PowerCenter Connect for SAP NetWeaver BW Option 8.x,
follow the instructions in the PowerCenter Installation and Configuration Guide to
upgrade PowerCenter. When you upgrade to the current version of PowerCenter, you
also upgrade to the latest version of PowerCenter Connect for SAP NetWeaver BW
Option.
3. Uninstall PowerCenter Integration Server for SAP BW (PCISBW) according to the
following guidelines:
♦ If you are upgrading from PowerCenter Connect for SAP BW 7.x or earlier, uninstall
PowerCenter Integration Server for SAP BW (PCISBW).
♦ If you are upgrading from PowerCenter Connect for SAP NetWeaver BW Option 8.x,
you can skip this step. When you upgrade to the current version of PowerCenter, you
also upgrade to the current version of PowerCenter Connect for SAP NetWeaver BW
Option, which includes the BW Service.
4. Install the current version of PowerCenter. When you install the current version of
PowerCenter, you also upgrade to the latest version of PowerExchange for SAP
NetWeaver.
Note: PowerCenter Connect for SAP BW is renamed to PowerCenter Connect for SAP
NetWeaver BW Option. Informatica PowerCenter Integration Server for SAP BW
(PCISBW) is renamed to SAP BW Service.
Creating Profiles for Production and Development Users
The SAP administrator needs to create the following production and development user
authorization profiles.
Table 3-1 shows the authorization profile configuration for a development user (BW write):
Table 3-3 shows the authorization profile configuration for a development user (BW read):
Table 3-4 shows the authorization profile configuration for a production user (BW read):
5. Click Enter.
6. In the Create Source System dialog box, enter the following information and click Enter:
Parameter Description
Logical System Name Name of the logical system. For example, you can enter LSPowerCenterBW.
PowerCenter Component | Entry Type | Usage
PowerCenter Client | Type A or Type B | Import transfer structures from BW. Use the DEST entry in the Import BW Transfer Structure dialog box.
SAP BW Service | Type R | Register as an RFC server and receive requests to run sessions. Use the DEST entry when you create the SAP BW Service.
The saprfc.ini file uses the following parameters. The second column lists the entry types
(A, B, R, or All) that use each parameter:

Parameter | Entry Type | Description
DEST All Destination in RFCAccept. For SAP versions 4.6C and higher, use no more than
32 characters. For earlier versions, use no more than 8 characters. All DEST
entries must be unique. You must have only one DEST entry for each BW
system.
Use this parameter for the Type A or Type B entry as the connect string when
you import an InfoSource in the Target Designer and when you configure
database connections in the Workflow Manager.
TYPE All Set to A, B, or R. Specifies the type of connection. For the entry types required
for each PowerCenter component, see Table 3-6 on page 53.
PROGID R Program ID for the logical system you create in BW for the SAP BW Service, as
described in “Defining PowerCenter as a Logical System in SAP BW” on
page 52. The Program ID in BW must match this parameter, including case.
3. If you are connecting to multiple BW systems, create multiple entries in the saprfc.ini file
with unique DEST parameters.
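For example, a saprfc.ini file for BW Option might contain a Type A entry for the PowerCenter Client and a Type R entry for the SAP BW Service, as in the following sketch. The host name, system number, and gateway values are placeholders, and the PROGID value assumes the LSPowerCenterBW logical system name used as an example earlier in this chapter.

```ini
DEST=bwdev
TYPE=A
ASHOST=sapbw01
SYSNR=00

DEST=bwservice
TYPE=R
PROGID=LSPowerCenterBW
GWHOST=sapbw01
GWSERV=sapgw00
```

The Type A DEST value is the connect string you use when you import an InfoSource, and the Type R DEST value is the one you enter when you create the SAP BW Service.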
Property | Required/Optional | Description
Service Name Required Name of the SAP BW Service. The name is not case sensitive and must be
unique within the domain. The characters must be compatible with the code
page of the associated repository. The name cannot have leading or trailing
spaces, include carriage returns or tabs, exceed 79 characters, or contain the
following characters:
\/*?<>"|
Location Required Name of the domain and folder in which the SAP BW Service is created. The
Administration Console creates the SAP BW Service in the domain where you
are connected. Click Select Folder to select a new folder in the domain.
Domain for Associated Integration Service | Required | Domain that contains the Integration Service associated with the SAP BW Service. If the PowerCenter environment consists of multiple domains, you can click Manage Domain List to access another domain. You can then associate the SAP BW Service with an Integration Service that exists in another domain. For more information about working with multiple domains, see “Managing the Domain” in the PowerCenter Administrator Guide.
SAP Destination R Type | Required | Type R DEST entry in the saprfc.ini file created for the SAP BW Service. For more information about the saprfc.ini file, see “Configuring the saprfc.ini File” on page 53.
3. Click OK.
A message informs you that the SAP BW Service was successfully created.
4. Click Close.
The SAP BW Service properties window appears.
5. Click Enable to start the SAP BW Service.
Before you enable the SAP BW Service, you must define PowerCenter as a logical system
in SAP BW. For more information, see “Defining PowerCenter as a Logical System in
SAP BW” on page 52.
For more information about configuring the SAP BW Service, see “Creating and
Configuring the SAP BW Service” in the PowerCenter Administrator Guide.
Chapter 4
Importing SAP R/3 Source Definitions
Overview
When you import source definitions from SAP, the Designer uses RFC to connect to the SAP
application server. The Designer calls functions in the SAP system to import source
definitions. SAP returns a list of definitions from the SAP dictionary. You can select multiple
definitions to import into the PowerCenter repository. The Designer imports the definition as
an SAP R/3 source definition. After you import a definition, use it in a mapping to define the
extract query.
You can import the following definitions into the PowerCenter repository:
♦ SAP tables and views. SAP tables include transparent, pool, and cluster tables. In
addition, you can extract data from database views in SAP.
♦ SAP hierarchies. A hierarchy is a tree-like structure that defines classes of information.
♦ SAP IDocs. An IDoc is a generated text file that contains a hierarchical structure
consisting of segments.
If the source changes after you import the definition, reimport the definition as a new SAP
R/3 source definition.
The following example shows the key columns of two related tables, where P indicates a
primary key and F a foreign key. The first table contains MANDT (P) and EBELP (P). The
related table contains MANDT (P), LPONR (F), KONNR (F), and ANFNR (F).
Note: SAP does not always maintain referential integrity between primary key and foreign key
relationships. If you use SAP R/3 source definitions to create target definitions, you might
encounter key constraint errors when you load the data warehouse. To avoid these errors, edit
the keys in the target definition before you build the physical targets.
Uniform Hierarchies
A hierarchy is uniform if each level in the hierarchy represents the same type of information.
Figure 4-4 shows a sample uniform hierarchy:
In this example, the root node represents the company name, level one represents Division,
and level two represents Employee ID.
Table 4-2 shows the join of the sample hierarchy in Figure 4-4 and the detail table in
Table 4-1. The shaded columns represent data extracted from the detail table:
Tech Co. TECH-ID Sales SLS-ID 125 150 125 John Franks
Tech Co. TECH-ID Sales SLS-ID 125 150 126 Angela Michaelson
Tech Co. TECH-ID Sales SLS-ID 125 150 149 Jose Fernandez
Tech Co. TECH-ID Sales SLS-ID 200 200 200 Jackson Green
Tech Co. TECH-ID R&D RD-ID 325 335 326 Jeff Katigbak
Tech Co. TECH-ID R&D RD-ID 325 335 334 Tracy Scott
Tech Co. TECH-ID R&D RD-ID 336 400 340 Betty Faulconer
Tech Co. TECH-ID R&D RD-ID 413 435 413 Sandy Jackson
Tech Co. TECH-ID R&D RD-ID 450 500 450 Aaron Williams
Hierarchy Definitions 69
Non-Uniform Hierarchies
A hierarchy is non-uniform if one or more of the branches do not have the same number of
levels.
Figure 4-5 shows a sample non-uniform hierarchy:
In this example, there is a level between Division and Employee that represents Department.
The Sales Division branches directly to Employee.
When the Integration Service extracts data for this hierarchy, it inserts NULLs in the level
two and SetID columns associated with Sales. Some nodes do not have descriptions, but all
nodes have SetIDs. If a node does not have a description, the Integration Service inserts
NULLs in that column, and it extracts the SetID into the associated SetID column.
Table 4-3 shows the join of the sample hierarchy in Figure 4-5 on page 70 and the detail table
in Table 4-1 on page 69. The shaded columns represent data extracted from the detail table:
Tech Co. TECH-ID Sales SLS-ID NULL NULL 125 150 125 John Franks
Tech Co. TECH-ID Sales SLS-ID NULL NULL 125 150 126 Angela Michaelson
Tech Co. TECH-ID Sales SLS-ID NULL NULL 125 150 149 Jose Fernandez
Tech Co. TECH-ID Sales SLS-ID NULL NULL 200 200 200 Jackson Green
Tech Co. TECH-ID R&D RD-ID Research RSCH-ID 325 335 326 Jeff Katigbak
Tech Co. TECH-ID R&D RD-ID Research RSCH-ID 325 335 334 Tracy Scott
Tech Co. TECH-ID R&D RD-ID Research RSCH-ID 336 400 340 Betty Faulconer
Tech Co. TECH-ID R&D RD-ID NULL <NULL>-ID 413 435 413 Sandy Jackson
Tech Co. TECH-ID R&D RD-ID NULL <NULL>-ID 450 500 450 Aaron Williams
After you import a hierarchy definition, the Designer creates the following columns:
♦ The root node and SetID. The Designer creates two columns for the hierarchy root, one
column for the root node, and one for the SetID of the root node.
♦ Each node level and SetID. The Designer creates two columns for each level representing
higher nodes in the hierarchy, one column for each node level and the SetID of the node
level.
♦ Detail range for the leaf nodes. The Designer creates two columns to represent the range
of values for the leaf nodes in the hierarchy. These columns are named FROM_VALUE
and TO_VALUE.
For example, when you import a hierarchy with the structure as shown in Figure 4-7 on page
72, the Designer creates a definition as shown in Figure 4-8 on page 72.
Figure 4-7 shows a sample hierarchy structure:
Higher Nodes
Leaf Nodes
Figure 4-8 shows the definitions the Designer creates from the hierarchy structure in
Figure 4-7:
Root Node
Leaf Nodes
When you import definitions of related tables, the Designer imports the key relationships.
However, when you import a hierarchy and its detail table, you create a logical relationship.
The detail table contains the detail information for the leaf nodes. The hierarchy table
contains the range of values for the details.
IDoc Definitions 75
Figure 4-10 shows the Import SAP Metadata dialog box where you import IDoc definitions:
IDoc Segments
You can import an entire IDoc or an individual segment of an IDoc. When you import an
entire IDoc, the Designer imports every segment in the IDoc. After you import an entire
IDoc, each segment in the IDoc is independent of the other segments in the IDoc.
The Designer also displays the following IDoc type properties on the Properties tab of the
IDoc definition:
♦ IDoc Type. The name of the IDoc definition.
♦ Basic IDoc Type. The name of the basic IDoc type.
♦ Extension IDoc Type. The name of the user-defined extension of a basic IDoc type.
Importing a Source Definition
When you import source definitions, you connect to the SAP system through the Import SAP
Metadata dialog box. The Designer provides the following tabs in the Import SAP Metadata
dialog box:
♦ Tables. Import table and view definitions.
♦ Hierarchies. Import hierarchy definitions.
♦ IDocs. Import IDoc definitions.
You can also enter filter criteria to reduce the number of definitions the Designer displays in
the selection list. If the first character of an SAP source name is an asterisk (*) or a number,
the Designer converts the first character to an underscore (_) when you import the source
definition.
Field | Required/Optional | Description
Connect String Required Type A or Type B destination entry in the saprfc.ini file.
User Name Required SAP source system connection user name. The user name must be a user for
which you have created a source system connection.
Language Required Language you want for the mapping. The language must be compatible with
the PowerCenter Client code page. For more information about language and
code pages, see “Language Codes, Code Pages, and Unicode Support” on
page 395.
If you leave this option blank, PowerCenter uses the default language of the
SAP system.
Select Tables, Hierarchies, or IDocs.
7. If you are importing table definitions, clear All Keys if you want to import a subset of all
key relationships.
For more information, see “Importing Key Relationships” on page 66.
8. Select the object or objects you want to import.
♦ Hold down the Shift key to select blocks of sources.
1. In the Source Analyzer, double-click the title bar of the source definition.
Although you can edit column names in the Source Analyzer, the column names of the
source definition need to match the column names of the SAP source to generate an
ABAP program.
5. Click the Metadata Extensions tab.
6. Optionally, create user-defined metadata extensions.
Use metadata extensions to associate information with repository metadata. For more
information about metadata extensions, see “Metadata Extensions” in the PowerCenter
Repository Guide.
7. Click OK.
1. In the Source Analyzer, double-click the title bar of the hierarchy definition.
The Edit Tables dialog box appears.
2. Click Rename.
When I view the properties of an imported SAP R/3 table definition, I see the number sign (#)
for several characters in the table description.
The Designer displays the number sign (#) for each character that is not converted while
importing metadata from SAP. If the Integration Service is running in Unicode mode, the
conversion error probably occurred because the imported table does not have a description in
the connection language you selected in the Import SAP Metadata dialog box. Log in to the
SAP system and enter a table description for this language.
Troubleshooting 85
86 Chapter 4: Importing SAP R/3 Source Definitions
Chapter 5
Overview
Complete the following steps when you create a mapping with an SAP R/3 source:
1. Configure the source definition. The source definition in a mapping has the following
configuration properties that allow you to optimize performance when you extract from
SAP:
♦ Select option. Restricts the number of rows returned from the SAP R/3 source.
♦ Order By. Orders by primary keys or by a specific number of ports.
2. Create and configure the Application Source Qualifier. For more information about the
Application Source Qualifier, see “Application Source Qualifier for SAP R/3 Sources” on
page 119.
3. Install the ABAP program. Install an ABAP program that extracts source data when you
run a workflow.
Use the following guidelines when you create a mapping with an SAP R/3 source:
♦ The mapping name cannot exceed 56 characters.
♦ The mapping name or description and the folder or repository name in which you save the
mapping cannot contain the word “REPORT.” When the word “REPORT” exists in the
mapping name or description and the folder or repository name, the ABAP program fails.
For more information about open and exec SQL, see “Generating an ABAP Program” on
page 121.
Select Single
Select Single is an open SQL command that returns one row from the SAP R/3 source. When
you use open SQL, the Designer generates a Select Single statement for each source definition
you configure with the Select Single option.
You might use this option with a nested loop join when you want to join tables based on key
values in one table. The inner loop with Select Single matches one record for each join
condition. Select Single improves performance in the select loop by selecting a single row of
data rather than the entire table.
Select Single is not available with the following options:
♦ Exec SQL and ABAP join syntax. Exec SQL and ABAP join syntax do not recognize Select
Single. Therefore, the Designer does not generate a Select Single statement in the ABAP
program if you configure the Application Source Qualifier to generate exec SQL or ABAP
join syntax.
♦ Order By. If you configure the source definition to use Select Single and Order By, the
Designer generates a Select Single statement in the ABAP program, but it does not
generate an Order By statement.
♦ Hierarchy and IDoc definitions. The Select Single option is not available for hierarchy
and IDoc definitions.
Transparent Tables
When you specify the number of Order By ports for a source definition in a mapping, the
Designer generates an Order By clause beginning with the first column in the definition. The
Designer generates the Order By statement using the following guidelines:
♦ If you specify a number of ports to Order By that is greater than the number of ports in
the source definition, the ABAP program generates an Order By statement using all ports
in the source definition.
♦ SAP requires all columns in the Order By statement to be part of the select statement. If
you include a column in the Order By selection, but you do not project it into the
Application Source Qualifier, the ABAP program adds that column to the select statement.
The ABAP program does not, however, extract the data from the column you excluded
from the Application Source Qualifier.
The Order By statement is different for exec SQL, open SQL, and ABAP join syntax. The
following samples are based on the same mapping that joins KONH and KONP in one
Application Source Qualifier. Each source definition is configured to order by three ports.
Exec SQL
When you use exec SQL, the Order By statement is similar to standard relational statements:
exec sql [...]
INTO [...]
endexec.
Open SQL
into [...]
into [...]
from KONP
where [...]
Note: The Designer does not generate an Order By clause if you use select single in the source
properties.
ABAP Join Syntax
INTO [...]
FROM KONH
ON KONP~KNUMH = KONH~KNUMH
WHERE
In the Source Analyzer, you can import an entire IDoc or an individual segment of an IDoc. If
two different IDocs have segments with the same name, you can edit the IDoc Type to
indicate which segment you want to use in the mapping.
For example, the IDocs E1BPACAR01 and E1BPACAR02 both have a segment named
E1MVKEM. You import E1MVKEM from E1BPACAR01 in the Source Analyzer. You
cannot import E1MVKEM twice in the Source Analyzer. To use the E1MVKEM segment
from E1BPACAR02, change the IDoc Type in the Mapping Designer to E1BPACAR02.
Mapping Version
If a mapping becomes invalid after you install the program, validate the mapping, save the
repository, and re-install the ABAP program. If you save the repository with the mapping
open after you install the ABAP program, the session fails, and the session log instructs you to
regenerate and install the program.
Select Old/Deleted Mapping Versions: Select to uninstall ABAP programs from all old and
deleted mapping versions.
For more information about uninstalling ABAP programs, see “Uninstalling the ABAP
Program and Viewing Program Information” on page 102.
Generating the ABAP Program and Installing Directly onto the SAP
System
You can install the ABAP program directly onto the SAP system. When you install directly
onto the SAP system for the first time, the Designer generates a program name.
If you are generating the ABAP program for the first time, you can override the generated
program name by selecting Enable Override. You cannot override the local file name.
To generate the ABAP program and install it directly onto the SAP system:
Field | Required/Optional | Description
Connect String Required Type A or Type B destination entry in the saprfc.ini file.
Language Required Language in which you want to receive messages from the SAP system
while connected through this dialog box. If you leave this option blank,
PowerCenter connects using the default language of the SAP system. For
more information about the language codes for SAP, see “Language Codes,
Code Pages, and Unicode Support” on page 395.
3. Click Connect.
A list of mappings in the folder appears. If the mappings are versioned, the version
number appears next to each version of the mapping.
Use the following guidelines when you select the mappings for which you want to install
ABAP:
♦ You can install ABAP for all versions of a mapping.
♦ You can install ABAP for different versions of the same mapping.
♦ You can install ABAP for all versions of multiple mappings.
5. Select the Program Mode: File or Stream.
6. Optionally, select Enable Override to override the default ABAP program name.
7. Optionally, select Use Namespace to prefix a namespace you registered with SAP to the
ABAP program name.
8. In the Development Class box, enter the name of the development class where you want
to install the program.
The default development class is $TMP.
Note: The $TMP development class is temporary. You cannot transport ABAP programs
from this class to another system.
Install ABAP programs that use a namespace in a development class that is in the same
namespace.
9. Click Direct Installation.
10. Enter the ABAP program name if you selected Enable Override. If you want to use a
namespace, enter the namespace as a prefix to the ABAP program name.
This step also generates the ABAP program.
4. To view the local ABAP copy, click View File and double-click the file name.
5. Click Install From Files.
6. Double-click the ABAP file from the Open ABAP File dialog box.
5. Click Uninstall.
The output window displays a message indicating the program was successfully
uninstalled:
Program YEKKO_99 un-installed successfully from destination sophie.
3. Expand the SAP system node to select the orphan or shared ABAP program you want to
clean.
4. Click Uninstall.
5. Click Close.
Overview
SAP functions are generic modules in the SAP system. The SAP system has a set of standard
functions and user-defined functions. SAP functions can accomplish simple tasks such as
getting the name of a field, or complex tasks such as calculating depreciation.
If the mapping requires an ABAP program to extract data from SAP, you can insert SAP
functions in the ABAP program flow dialog box in the Application Source Qualifier to
customize how the ABAP program extracts data. When you run a workflow, the ABAP
program calls the SAP function to perform tasks.
To use SAP functions in the ABAP program, you first import the functions in the Source
Analyzer and then insert them in the ABAP program flow. In the ABAP program flow, you
assign values to function parameters so the function can perform calculations. You then assign
variables to hold the result of the function output.
After you customize the ABAP program, you generate and install the ABAP program. The
Designer generates a CALL FUNCTION statement in the ABAP program to use the SAP
function.
Field | Required/Optional | Description
Connect String Required Type A or Type B destination entry in the saprfc.ini file.
User Name Required SAP source system connection user name. The user name must be a user for
which you have created a source system connection.
Language Required Language you want for the mapping. The language must be compatible with
the PowerCenter Client code page. For more information about language and
code pages, see “Language Codes, Code Pages, and Unicode Support” on
page 395.
If you leave this option blank, PowerCenter uses the default language of the
SAP system.
Figure 6-2. SAP Function Parameters in the ABAP Program Flow Dialog Box
Note: Any error handling options you set in the session properties for the ABAP mappings do
not apply to errors returned by SAP functions you use in the ABAP program flow of the
mappings.
You can choose from variables already defined in the program flow. The Designer shows a
list of variables whose datatype, precision, and scale match the parameter. You can
also click in the value field and enter the name of a new variable. The Designer
creates the new variable when you enter the name.
4. From the Scalar Output tab, assign variables for scalar output parameters.
5. Select SQ Port if you want the Designer to create an output port for the import
parameter.
6. From the Changing tab, assign variables for changing parameters.
You do not need to assign variables for optional changing parameters.
7. Select SQ Port if you want the Designer to create an output port for the changing
parameter.
Table 6-1. Rules for Inserting an SAP Function in the ABAP Program Flow
Exec SQL - You can insert SAP functions before the first source or after the last source. You cannot
insert SAP functions between sources in the program flow.
- If you insert an SAP function before the first source, the Designer calls the function
before the exec statement.
- If you insert an SAP function after the last source, the Designer inserts the function
after the FORM WRITE_DSQNAME_TO_FILE statement.
ABAP join syntax - You can insert SAP functions before the first source or after the last source. You cannot
insert functions between sources in the program flow.
- If you insert an SAP function before the first source, the Designer calls the function
before the select statement.
- If you insert an SAP function after the last source, the Designer inserts the function
after the WHERE clause.
Open SQL (nested loop) - You can insert SAP functions between sources in the program flow. The Designer
inserts SAP functions between the select statements. You can insert SAP functions
before the first source or after the last source.
- If you insert an SAP function before the first source, the Designer calls the function
before the first select statement.
- If you insert an SAP function after the last source, the Designer inserts the function
after the last WHERE clause.
Overview
When you add an SAP R/3 source definition to a mapping, connect it to an Application
Source Qualifier transformation. The Application Source Qualifier represents the record set
queried from the SAP R/3 source when you run a session.
When you complete the mapping, generate and install the ABAP program that the SAP
application server uses to extract source data.
The Designer generates an ABAP program based on the properties in the source definition
and the Application Source Qualifier. The Designer can generate open SQL, exec SQL, or
ABAP join syntax. You can also set the tracing level for session processing.
In the Application Source Qualifier, you can customize the ABAP program in the ABAP
Program Flow dialog box. The ABAP Program Flow dialog box shows the order that the
ABAP program processes objects. You can configure properties such as filter condition and
join condition in the ABAP Program Flow dialog box.
In the ABAP Program Flow dialog box, use static or dynamic filters to specify how the ABAP
program selects rows. You can also add more functionality to the ABAP program by adding
ABAP code blocks. You can create variables and import SAP functions to use in the ABAP
code or filter condition.
If you join multiple sources with one Application Source Qualifier, you can specify how the
ABAP program joins the source tables. You can also specify the order that the ABAP program
selects the source tables.
The SAP system version and the mapping contents determine the available SQL generation
modes:
♦ Mapping contains only transparent tables and you connect to an SAP 3.x system: Open
SQL, Exec SQL.
♦ Mapping contains only transparent tables and you connect to an SAP 4.x system: ABAP
join syntax, Open SQL, Exec SQL.
♦ Mapping contains IDocs and you connect to an SAP 3.x system: Open SQL.
♦ Mapping contains IDocs and you connect to an SAP 4.x system: ABAP join syntax, Open
SQL.
In the Application Source Qualifier, the Designer does not check whether you selected the
correct ABAP generation mode for the mapping. When you connect to an SAP system and
generate the ABAP program, the Designer validates that the selected ABAP generation mode
matches the condition of the mapping and the SAP system version.
A hierarchy is a structure of metadata and is not accessible through SQL. The Designer does
not generate an ABAP program to extract data from hierarchies. The Integration Service
makes a remote function call to the application server to extract the hierarchy metadata.
into (T685-MANDT,T685-KVEWE,T685-KAPPL,T685-KSCHL,T685-KOZGF,
T685-DATVO,T685-DTVOB)
endselect.
When you join several sources in one Application Source Qualifier, open SQL uses a nested
loop to select data. The Designer issues multiple SELECT statements and then generates
WHERE clauses for the join conditions within the nested loop. For more information about
joining sources, see “Joining Source Data” on page 127.
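The nested-loop pattern can be sketched as follows. This is an illustrative outline, not the exact generated code; the column lists are abbreviated, and the tables and clause names mirror the KONH/KONP samples used elsewhere in this chapter:

```abap
* Illustrative sketch of an open SQL nested-loop join (abbreviated).
select KNUMH MANDT [...]
  from KONH
  where (KONH_clause).          " outer loop over KONH rows
  select [...]
    from KONP
    where KNUMH = KONH-KNUMH    " join condition generated in the nested WHERE clause
      and (KONP_clause).
    " process the joined row
  endselect.
endselect.
```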
1. With a mapping open, double-click the title bar of an Application Source Qualifier
transformation.
The Edit Transformations dialog box appears.
2. Click the Properties tab.
3. Select Force Nested Loop.
4. Click OK.
T685.DATVO, T685.DTVOB
INTO
:T685-DATVO, :T685-DTVOB
FROM
T685
where [...]
endexec.
Note: Exec SQL is not available for pool or cluster tables, hierarchies, or IDocs.
1. In the Edit Transformations dialog box for the Application Source Qualifier, click the
Properties tab.
2. Select Exec SQL.
3. Click OK.
The ABAP program selects tables and objects according to the order in the program flow.
Select a table in the program flow to configure the filter condition, join type, and join
condition for the selected table. You can complete the following tasks in the ABAP Program
Flow dialog box:
♦ Change the order of the program flow. Use the up and down arrows to arrange the order
of objects in the program flow.
♦ Insert SAP functions. You can insert SAP functions in the program flow after you have
imported them in the Source Analyzer. For more information about inserting SAP
functions, see “Working with SAP Functions in ABAP Mappings” on page 107.
♦ Create and insert ABAP code blocks in the program flow. You can add new functionality
to the program flow by inserting more blocks of ABAP code. Use ABAP program variables
in the code block. For more information about ABAP code blocks, see “Creating an ABAP
Code Block” on page 135.
♦ Create ABAP program variables. You can create ABAP program variables to represent
values in the ABAP program. The ABAP program variables can also represent SAP system
from KONH
where
(KONH_clause)
from KONP
(KONP_clause)
endselect. [...]
The following sample ABAP program is generated using open SQL to join KONH and
KONP with an outer join:
select KNUMH MANDT [...] LICDT
from KONH
where
(KONH_clause)
from KONP
where
(KONP_clause)
endselect.
if sy-subrc <> 0.
endif.
endselect. [...]
FROM
where
KONM.KNUMH = KONP.KNUMH
endexec.
FROM KONP
ON KONH~KNUMH = KONP~KNUMH
WHERE
(KONP_clause) and
(KONH_clause)
endselect.
You can also choose outer join for the join type. The following sample ABAP program is
generated to join KONP and KONH using an outer join:
SELECT KONH~MANDT [...]
ON KONH~KNUMH = KONP~KNUMH
WHERE
(KONP_clause) and
(KONH_clause)
endselect.
To use ABAP join syntax, clear both the Force Nested Loop option and the Exec SQL option
in the Application Source Qualifier transformation. You cannot use ABAP join syntax to join
hierarchies.
When you have a mapping with inner/outer joins between multiple SAP R/3 tables and you
use a dynamic filter on an outer join table, the session fails. This occurs due to ABAP join
restrictions on certain combinations of inner and outer joins between SAP tables. SAP
produces the following error message:
[CMRCV: 18 Illegal access to the right-hand table in a LEFT OUTER JOIN].
1. Double-click the title bar of an Application Source Qualifier transformation and select
the Properties tab.
2. Select how you will generate the ABAP program: Exec SQL, open SQL (Force Nested
Loop), or ABAP join syntax. To select ABAP join syntax, clear both Exec SQL and Force
Nested Loop.
4. Select the Join Type tab in the ABAP Program Flow dialog box.
5. Select a table from Program Flow and choose the join type (inner or outer).
If you generate the ABAP program using exec SQL, you can choose only inner join in the
Join Type tab.
6. Select the sources you want to join in Source(s) to Join to.
♦ If you want to enter two statements in the join condition, separate the statements with a
semicolon (;) or AND.
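For example, a join condition that contains two statements might look like the following. The tables and fields are illustrative only:

```abap
KONH-KNUMH = KONP-KNUMH AND KONH-MANDT = KONP-MANDT
```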
1. In the ABAP Program Flow dialog box, select the Join Condition tab.
2. Select the table in Program Flow to edit the join condition.
You can view a list of field names by double-clicking a table name in Source Level
Attributes.
1. In the ABAP Program Flow dialog box, click New ABAP Block.
2. Enter the name for the new ABAP code block and click OK.
The name of the ABAP code block cannot exceed 28 characters.
3. Expand the source table name and the Variables folder to view the source fields and
variable names.
4. Double-click the source field or the variable name to enter it in the ABAP code block.
Exec SQL - You can insert code blocks before the first source or after the last source. You cannot
insert code blocks between sources in the program flow.
- If you insert a code block before the first source, the Designer inserts the code block
before the exec statement.
- If you insert a code block after the last source, the Designer inserts the code block after
the FORM WRITE_DSQNAME_TO_FILE statement.
ABAP join syntax - You can insert code blocks before the first source or after the last source. You cannot
insert code blocks between sources in the program flow.
- If you insert a code block before the first source, the Designer inserts the code block
before the select statement.
- If you insert a code block after the last source, the Designer inserts the code block after
the where clause.
Open SQL (nested loop) - You can insert code blocks between sources in the program flow. The Designer inserts
code blocks between the select statements. You can insert code blocks before the first
source or after the last source.
- If you insert a code block before the first source, the Designer inserts the code block
before the first select statement.
- If you insert a code block after the last source, the Designer inserts the code block after
the last where clause.
The following rules apply when you create a code block to specify the initial value of ABAP
program variables:
♦ If you create a code block to initialize variables used in a filter condition, insert the code
block before the first source.
♦ If you create a code block to initialize variables used in data movement, insert the code
block after the last source.
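For example, a code block inserted before the first source might give an initial value to a variable used in a filter condition. The variable name and value here are hypothetical:

```abap
* Hypothetical initialization code block inserted before the first source.
* var1 was created in the ABAP Program Variables dialog box and is
* referenced in a static filter condition as :var1.
var1 = '0000004000'.
```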
Naming Convention
When you create a variable in the ABAP program flow, be aware of the following rules:
♦ Variable names cannot contain reserved words or special characters such as the pound
sign (#).
♦ Variable names are not case sensitive.
♦ The maximum length for variable names is 25 characters.
♦ The maximum length for the definition of structure or structure field variables is 30
characters.
When you generate and install the ABAP program, the SAP system validates the ABAP
program variable according to the following rules:
♦ The maximum length for the initial value of a variable is 40 characters.
♦ Variable names cannot contain any SAP datatype names, table, structure, and structure
field names, or any ABAP keyword.
The structure AENVS has a field called EXIST. You can create a structure field variable called
field1 to represent this field. The Designer generates the following statement in the ABAP
program to declare field1:
data: FIELD1 like AENVS-EXIST.
After you create the structure field variable, you specify its initial value in an ABAP code
block.
1. In the ABAP Program Flow dialog box, click Variables to open the ABAP Program
Variables dialog box.
2. Click Add.
3. Enter a name for the new ABAP program variable.
4. Choose Structure for the Variable Category and click OK.
5. Enter a definition for a structure variable.
The variable definition must be the name of an existing structure in the SAP system. You
cannot specify an initial value for a structure variable.
6. Click OK.
5. Click OK.
Variable_Name = 'Initial_Value'.
For example, you create an ABAP type variable named var1 to represent a currency value. If
you specify zero for the initial value and one for precision and scale, the Designer generates
the following statement in the ABAP program:
data: var1(1) type P decimals 1.
var1 = '0'.
The ABAP statement does not include precision and scale if you choose a datatype that has
fixed precision and scale.
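For example, if you choose a date datatype such as D, which has a fixed length, the Designer might generate a declaration like the following sketch. The variable name and initial value are hypothetical:

```abap
* Sketch only: no precision or decimals clause is generated for a fixed type.
data: var2 type D.
var2 = '20050101'.
```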
1. In the ABAP Program Flow dialog box, click Variables to open the ABAP Program
Variables dialog box.
2. Click Add.
3. Enter a name for the new ABAP program variable.
4. Choose ABAP Type for the Variable Category and click OK.
5. Choose a datatype for Type from the list.
6. Enter the precision and scale for the variable.
You cannot edit these fields if you choose a datatype that has fixed precision and scale.
If you select DEC, the precision must be 14 or less.
7. Enter the initial value for the variable.
8. Select SQ Port if you want the Designer to create an output port for this variable.
9. Click OK.
You can then use sysvar1 to represent sy-datum in the ABAP program flow. You cannot enter
an initial value for system variables in the ABAP Program Variables dialog box. You can
assign initial values for system variables in the ABAP code block. For more information about
creating an ABAP code block, see “Creating an ABAP Code Block” on page 135. For more
information about creating a filter condition, see “Entering a Source Filter” on page 141.
Dynamic and static filters differ in the following ways:
♦ Filter condition. Dynamic filter: use constants, user-defined mapping variables and
parameters, and built-in mapping variables on the right side of the filter condition.
Static filter: use constants, user-defined mapping variables and parameters, and ABAP
program variables on the right side of the filter condition.
♦ Filter stored. Dynamic filter: the Designer stores the filter condition in the repository.
Static filter: the Designer writes the filter condition to the ABAP program.
♦ Filter processing. Dynamic filter: when you run the workflow, the Integration Service
moves the filter from the repository to the SAP system, and the ABAP program calls a
function to process the filter. Static filter: the SAP server processes the filter directly
from the ABAP program.
♦ Session properties. Dynamic filter: you can override the filter condition in session
properties. Static filter: you cannot override the filter condition in session properties.
♦ Session log file or Log Events window in the Workflow Monitor. Dynamic filter: the
Integration Service includes the filter syntax in log events. Static filter: the Integration
Service does not include the filter syntax in log events.
When you specify a port in a dynamic filter condition, static filter condition, or join override,
link that port from the Application Source Qualifier to the target or next transformation in
the mapping. Do the same if you specify a dynamic filter condition at the session level.
If you have a Unicode SAP system, the Designer is connected to a Unicode repository, and
you enter a source filter that contains ISO 8859-1 or multibyte characters, generate a local
copy of the ABAP program and upload the generated file into the SAP system. For more
information, see “Generating and Uploading into the SAP System” on page 101.
Note: You cannot use a source filter if the ABAP program flow contains a hierarchy and no
other sources.
1. In the ABAP Program Flow dialog box, select the table you want to filter.
2. Select the Dynamic Filter tab.
You can view a list of field names in the table by double-clicking the table name in
Source Level Attributes.
− If you use variables in a static filter condition, the filter condition must be in the
following format:
table_name-field_name [=, >=, <=, <, >, <>] :variable_name
− If you use fields from another table in a static filter condition, the filter condition must
be in the following format:
table_name1-field_name [=, >=, <=, <, >, <>] table_name2-field_name
Note: The left side of the filter condition must be a field from the table you selected in the
ABAP program flow.
♦ Enclose constants on the right side of the condition in single quotes.
♦ If you use an ABAP program variable in a static filter condition, a colon (:) must precede
the variable name.
♦ Filter conditions for character strings must match the full precision of the column if the
condition is numeric. For example, if you want to filter records from CSKS where KOSTL
is greater than 4,000, enter the following condition:
KOSTL > '0000004000'
1. In the ABAP Program Flow dialog box, select the table you want to filter.
2. Select the Static Filter tab.
3. In Source Level Attributes, double-click on the table name or the Variables folder to view
a list of fields or ABAP program variables.
4. Double-click the field names or variables to enter them into the filter condition. If you
use a variable in a static filter condition, enter a colon (:) before the variable name.
You can also use fields from other tables in the ABAP program flow in the right side of
the filter condition.
5. Finish entering the filter condition.
6. Click Validate to validate the syntax of the filter condition.
7. Click OK.
To update the beginning of the time period for the next session, you set the $$FROMTIME
variable to the session start time of the current session. The built-in system variable
In an ABAP code block, use a mapping variable as a constant in the right-hand side of
comparisons. You cannot modify the mapping variable by assigning a value to it. For
example, you cannot assign a value to a mapping variable in the ABAP code block. For more
information about mapping variables, see “Mapping Parameters and Variables” in the
PowerCenter Designer Guide. For more information about system variables, see “Variables” in
the PowerCenter Transformation Language Reference.
4. Expand the IDoc type to view all the header information in the IDoc definition.
Header information of an IDoc definition is in the format of the SAP structure EDIDC.
6. If you want to use mapping variables and parameters in the IDoc filter condition, click
the Variables tab to view a list of mapping variables and parameters.
7. Double-click the mapping variable or parameter name to enter it in the IDoc filter
condition.
8. Click Validate to validate the syntax of the IDoc filter.
♦ Enclose constants on the right side of the filter condition in single quotes.
♦ Filter conditions for character strings must match the full precision of the column if the
condition is numeric.
♦ All valid SAP operators can be used in IDoc filter conditions.
♦ Use a semicolon (;) or boolean operators (such as AND) to separate multiple conditions.
♦ Always leave a space after every token except commas.
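Applying these rules, an IDoc filter condition on the EDIDC header fields might look like the following. The message type and status values are hypothetical:

```abap
EDIDC-MESTYP = 'MATMAS' AND EDIDC-STATUS = '03'
```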
Option Description
Tracing Level Sets the amount of detail included in the session log when you run a session containing
this transformation.
Program Flow Customizes the ABAP program with SAP functions, ABAP code blocks, variables, filters,
and join conditions.
IDoc Filter Specifies the filter condition for selecting IDoc source definitions.
7. Click Sources and indicate any additional source definitions you want to associate with
this Application Source Qualifier.
Transformation Datatypes
When you connect an SAP R/3 source definition to an Application Source Qualifier, the
Designer creates a port with a transformation datatype that is compatible with the SAP
datatype. When the ABAP program extracts from SAP, it stores all data, including dates and
numbers, in character buffers. When you configure an Application Source Qualifier, you
might want to change some of the dates and number datatypes to string to maintain accuracy
during conversion. For more information about datatypes, see “Datatype Reference” on
page 383.
Overview
After you create a mapping with an SAP R/3 source definition, create a session and use the
session in a workflow.
Complete the following tasks before you can run an SAP workflow:
♦ Configure an Integration Service. Configure an Integration Service to run SAP
workflows. For more information about configuring an Integration Service, see “Creating
and Configuring the Integration Service” in the PowerCenter Administrator Guide.
♦ Configure source and target connections. Configure connections to extract data from the
source and write data to the target. For example, to extract data from SAP, create an
application connection in the Workflow Manager. For more information about creating
an application connection for mySAP Option, see “Managing Connection Objects” in the
PowerCenter Workflow Administration Guide.
Table 8-1. SAP Streaming Reader and SAP Staging Reader Properties
For more information about configuring sessions to run in stream mode, see “Running
Stream Mode Sessions” on page 156. For more information about configuring sessions to run
in file mode, see “Running File Mode Sessions” on page 158.
Running Stream Mode Sessions
To run a session in stream mode, select SAP Streaming Reader as the source reader type in the
session properties. Select SAP Streaming Reader if you installed an ABAP program in the
Mapping Designer, and you selected stream mode when you generated the ABAP program.
Figure 8-1 shows the session properties for a stream mode session.
If you have separate application connections for file and stream modes, select the application
connection configured to connect to the SAP system using CPI-C.
When you run a session in stream mode, the installed ABAP program creates buffers on the
application server. The program extracts source data and loads it into the buffers. When a
buffer fills, the program streams the data to the Integration Service using CPI-C. With this
method, the Integration Service can process data when it is received.
Tip: Stream mode sessions require an online connection between PowerCenter and SAP to
transfer data in buffers. For this reason, all stream mode sessions use a dialog process. You
might want to use streaming when the window of time on the SAP system is large enough to
perform both the extraction and the load.
If you have separate application connections for file and stream modes, select the application
connection that is configured to connect to the SAP system using RFC.
When you run a file mode session, the installed ABAP program creates a staging file on the
application server. The program extracts source data and loads it into the file. The Integration
Service process is idle while the program extracts data and loads the file. When the file is
complete, the Integration Service accesses the file and continues processing the session. The
ABAP program then deletes the staging file unless you configure the session to reuse the file.
Tip: File mode sessions do not require an online connection between the Integration Service
and SAP while the generated ABAP program is extracting data. For this reason, you can run a
file mode session offline using a background process. Choose background processing when
the data volume is high and the extraction time exceeds the limit for dialog processes.
♦ Persist the Stage File on, Reinitialize the Stage File off. The ABAP program creates the
staging file if it does not exist. If the staging file exists, the ABAP program validates and
reuses the file. If validation fails, the ABAP program recreates the file. After the
Integration Service reads the file, it remains on the system for reuse.
♦ Persist the Stage File on, Reinitialize the Stage File on. The ABAP program creates the
staging file even if it exists. The file remains on the system for reuse.
♦ Persist the Stage File off, Reinitialize the Stage File off. The Integration Service reads
the staging file and deletes the file.
Overriding Filters
When you persist the staging file, session-level overrides do not apply. For example, if a
staging file exists, and you want to apply a one-time session-level override for a filter
Modes of Access
You can access staging files for an SAP session in the following ways:
♦ File Direct
♦ NFS Mount
♦ FTP
File Direct
Use File Direct when the file system is shared between the two machines. There are two file
direct situations:
♦ The SAP host and the Integration Service host are on the same machine.
♦ The SAP host and the Integration Service host are on different machines, but have a
common view of the shared file system. Map a drive from the Integration Service to the
machine where the staging files reside.
The user accessing the file must be the user that runs the Integration Service. If the SAP
system is on Windows, that user must have standard read permissions on the directory where
you stage the file. For more information about SAP systems on UNIX, see “Enabling Access
to Staging Files on UNIX” on page 162.
NFS Mount
Use NFS Mount when the file path and name are different for the SAP system and the
Integration Service. There are two NFS situations:
♦ One host is Windows and the other is UNIX. Map a drive from the Integration Service to
the machine where the staging files reside. The path names map differently between the
two platforms.
♦ The file system shared between the two hosts is mounted differently. Map a drive from
the Integration Service to the machine where the staging files reside.
The user accessing the file must be the user that runs the Integration Service. If the SAP
system is on Windows, that user must have standard read permissions on the directory where
you stage the file.
FTP
Use FTP when the Integration Service accesses the file system through an FTP connection.
There are two FTP situations:
♦ The FTP server is configured to view the entire file system. When the Integration Service
accesses SAP through FTP, the path to the file is identical.
♦ The FTP server is restricted to a certain directory or directories. The paths for the
Staging Directory and Source Directory are different.
Configure an FTP connection in the Workflow Manager. For more information about
creating an FTP connection, see “Managing Connection Objects” in the PowerCenter
Workflow Administration Guide.
The user who accesses the staging file must be the FTP user. If the SAP system is on
Windows, that user must have standard read permissions on the directory where you stage the
file. If the SAP system is on UNIX, see “Enabling Access to Staging Files on UNIX” on
page 162.
If the Integration Service fails to access the staging file through FTP, it logs the error message
returned by SAP in the session log. Use transaction ST22 from the SAP client to get more
information about the SAP error message.
1. The user accessing the staging file must create the staging directory.
2. Run the following UNIX command from the directory where you want to generate the
files:
% chmod g+s .
When you run this command, the staging files inherit the group ID of the directory, not
the SAP user that creates the file. There are no permission issues because the user who
accesses the file also owns the directory.
Note: If the SAP system is on Windows, the user who accesses the file must have standard read
permissions on the directory where you stage the file.
♦ File Direct. Stage file directory: /data/sap. Source file directory: /data/sap. Paths are
the same. Use a mapped drive if the machines are different.
♦ NFS. Stage file directory: /data/sap. Source file directory: e:\sapdir. Specify the path
from each machine. Use a mapped drive.
♦ FTP-Restricted. Stage file directory: e:\ftp\sap. Source file directory: /sap. The FTP
server is restricted to e:\ftp and you want the file in e:\ftp\sap. Specify the full path in
the staging directory (e:\ftp\sap) and the path from the restricted directory in the
source directory (/sap). Use an FTP connection.
To configure a session:
1. In the Task Developer, double-click an SAP session to open the session properties.
The Edit Tasks dialog box appears.
2. On the Properties tab, select Fail Task and Continue Workflow or Restart Task for the
Recovery Strategy property.
3. Configure the remaining general options and performance settings as needed.
4. From the Connections settings on the Mapping tab (Sources node), select the connection
values for the SAP R/3 sources.
Tip: If you are configuring a file mode session, you can access the staging file using an
FTP connection.
5. Click Readers and select the appropriate readers for the SAP R/3 sources.
♦ Stage File directory (Required). SAP path containing the stage file.
♦ Source File directory (Required). Integration Service path containing the source file.
♦ Reinitialize the Stage File (Optional). If enabled, the ABAP program extracts data and
replaces the existing staging file. You can only enable this option if you also enable
Persist the Stage File.
♦ Persist the Stage File (Optional). If enabled, the Integration Service reads an existing,
valid staging file. If the staging file does not exist or is invalid, the ABAP program
creates a new staging file. If cleared, the Integration Service deletes the staging file
after reading it. Default is disabled.
♦ Run Session in Background (Optional). Use when the data volume is high and the
extraction time is very long.
Chapter 8: Configuring Sessions with SAP R/3 Sources
Part III: IDoc Integration Using
ALE
This part contains the following chapters:
♦ Creating Outbound IDoc Mappings, 173
♦ Creating Inbound IDoc Mappings, 193
♦ Configuring Workflows for IDoc Mappings Using ALE, 205
Chapter 9
Creating Outbound IDoc Mappings
Overview
You can configure PowerCenter Connect for SAP NetWeaver mySAP Option to receive
outbound SAP IDocs in real time as they are generated by mySAP applications. To receive
outbound IDocs, mySAP Option integrates with mySAP applications using Application Link
Enabling (ALE). ALE is an SAP proprietary technology that enables data communication
between SAP systems. ALE also enables data communication between SAP and external
systems. For more information about SAP, see “IDoc Integration Using ALE” on page 6.
Note: Receiving outbound SAP IDocs is different from sourcing IDocs from static EDIDC
and EDIDD structures. For more information about sourcing IDocs from EDIDC and
EDIDD structures, see “IDoc Definitions” on page 75.
When you use mySAP Option to receive outbound IDocs from mySAP applications using
ALE, you can capture changes to the master data or transactional data in the SAP application
database in real time. When data in the application database changes, the SAP system creates
IDocs to capture the changes and sends the IDocs to the Integration Service.
The Integration Service and SAP use transactional RFC (tRFC) communication to send and
receive IDocs. tRFC is an SAP method that guarantees the RFCs are executed only once. As a
result, the Integration Service receives each IDoc only once.
If the PowerCenter session is not running when the SAP system sends outbound IDocs, the
Integration Service does not receive the IDocs. However, the SAP system stores the outbound
IDocs in EDI tables, which are a staging area for guaranteed message delivery.
You can configure the SAP system to resend the IDocs during the PowerCenter session by
configuring the tRFC port used to communicate with the Integration Service. When you
configure the port, you can enable background processes in SAP that try to resend the IDocs
to the Integration Service a set number of times. For more information about configuring the
SAP system, see the SAP documentation.
Creating an SAPALEIDoc Source Definition
To receive outbound IDocs from SAP using ALE, create an SAPALEIDoc source definition in
the Designer. An SAPALEIDoc source definition represents metadata for outbound IDocs.
When you create an SAPALEIDoc source definition, the Designer displays a table with IDoc
fields and SAP datatypes. When the Integration Service extracts data from the SAP source, it
converts the data based on the datatypes in the Source Qualifier transformation associated
with the source.
An SAPALEIDoc source definition contains predefined ports. You cannot edit the ports.
Table 9-1 describes the SAPALEIDoc source definition ports:
Tip: You only need to maintain one SAPALEIDoc source definition per repository folder.
When you include an SAPALEIDoc source definition in a mapping, you can add an instance
of the source definition to the mapping.
Transformation Datatypes
The transformation datatypes in the Application Multi-Group Source Qualifier are internal
datatypes based on ANSI SQL-92 generic datatypes, which PowerCenter uses to move data
across platforms. When the Integration Service reads data from an SAP source, it converts the
data from its native datatype to the transformation datatype. When you run a session, the
Integration Service performs transformations based on the transformation datatypes. When
writing data to a target, the Integration Service converts the data based on the native
datatypes in the target definition.
The transformation datatypes for all ports in the Application Multi-Group Source Qualifier
are predefined. You cannot change the datatype for any of the fields in the Application
Multi-Group Source Qualifier. For more information about transformation datatypes, see
“Datatype Reference” on page 383.
Figure 9-1. IDoc Display Tab of the SAP/ALE IDoc Interpreter Transformation
(Callouts: select to display transformation metadata; select to display the Group Status column.)
Table 9-2. IDoc Information in SAP/ALE IDoc Interpreter and Prepare Transformations

Property                  Description
Group Status              Required groups. Displays only if you select Show Group Status.
Transformation Datatype   Transformation datatype of the field. Displays only if you select
                          Show Transformation Metadata.
Transformation Precision  Transformation precision of the field. Displays only if you select
                          Show Transformation Metadata.
Transformation Scale      Transformation scale of the field. Displays only if you select
                          Show Transformation Metadata.
Figure 9-2 callouts: E1MARAM group; E1MARCM group.
Some segments and groups are required. In an SAP/ALE IDoc Prepare transformation, SAP/
ALE IDoc Interpreter transformation, and SAP DMI Prepare transformation, a required
segment must exist in the IDoc only if its group, its parent groups, and its parent segments
are required or selected. For example, in Figure 9-2, the E1MARAM group is required.
When you clear Show Group Status to hide the Group Status column, the Segment Status
column uses different rules to determine which segments are selected, depending on the type
of segment:
♦ Child segment that is not a parent. When the segment is required, the Segment Status
column is selected. For example, in Figure 9-3, the E1MAKTM segment is required, so its
Segment Status column is selected.
♦ Parent segment. When both the segment and its group are required, the Segment Status
column is selected. For example, in Figure 9-3, the E1MARAM segment and group are
required, so its Segment Status column is selected. The E1MARCM segment is required,
however its group is optional. The E1MARCM Segment Status column is cleared.
Figure 9-3. Segment Status Column Selections when Show Group Status is Cleared
(Callout: the Segment Status column shows that the E1MARCM segment is not required in the IDoc.)
Field           Required/Optional  Description
Connect String  Required           Type A or Type B destination entry in the saprfc.ini file.
User Name       Required           SAP source system connection user name. The user name must be a
                                   user for which you have created a source system connection.
Language        Required           Language you want for the mapping. The language must be compatible
                                   with the PowerCenter Client code page. For more information about
                                   language and code pages, see “Language Codes, Code Pages, and
                                   Unicode Support” on page 395. If you leave this option blank,
                                   PowerCenter uses the default language of the SAP system.
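The Connect String refers to a destination entry in the saprfc.ini file. A minimal Type A (application server) entry might look like the following sketch; the destination name, host, and system number are placeholders for your own landscape:

```ini
/* Type A destination: connect to a specific SAP application server */
DEST=SAPR3DEV
TYPE=A
ASHOST=sapdev.example.com
SYSNR=00
RFC_TRACE=0
```

A Type B entry instead uses load balancing through a message server.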
8. Click Connect.
After you connect to the SAP system, enter a filter to display specific IDoc types.
9. Select one of the following filter types:
♦ Message Type. Select to display IDocs by message type. The Designer displays the
basic and extended type for each IDoc that meets the filter condition.
♦ Basic IDoc Type. Select to display IDocs by basic IDoc type. The Designer displays
only the basic type for each IDoc that meets the filter condition.
♦ Extended IDoc Type. Select to display IDocs by extended IDoc type. The Designer
displays only the extended type for each IDoc that meets the filter condition.
10. Enter a filter condition.
12. To further refine the IDocs that display, you can select one or both of the following
options:
♦ Show Only Unknown Message Type. If IDocs display that are of an unknown
message type, you can select this option to display only those IDocs.
♦ Show Release For Message Type. Select to display IDocs by SAP release.
(Callout: click to view basic and extended IDoc types.)
14. Select the basic or extended IDoc whose metadata you want to import and click Next.
(Callouts: select to display the Group Status column; select the segments to include in the transformation.)
15. Select the IDoc segments you want to include in the mapping.
You can manually select the segments you want to include in the transformation. Or,
click one of the following options to select IDoc segments:
♦ Select All Segments. Click to include all segments.
♦ Clear All Segments. Click to remove all selected segments except required segments.
To display which groups are required, click Show Group Status. For more information
about the IDoc details that display in this step of the wizard, see Table 9-2 on page 179.
When you select the segments to include in the transformation, the transformation uses
the following rules to select parent and child segments:
♦ If you select a segment, all its parent segments are selected, and all its required child
segments are selected.
♦ If you clear a segment, all its child segments are cleared.
16. Click Next.
Step 3 of the wizard appears. The wizard provides a name for the transformation.
17. Optionally, modify the name of the transformation.
1. In the Transformation Developer or Mapping Designer, double-click the title bar of the
SAP/ALE IDoc Interpreter transformation.
The Edit Transformations window appears.
2. Select Output is Deterministic on the Properties tab if you want to enable recovery for an
outbound IDoc session.
Figure 9-5. Outbound IDoc Mapping with Invalid IDoc Error Handling
To write invalid outbound IDocs to a relational or flat file target, you must also configure the
outbound IDoc session to check for invalid IDocs. For more information, see “Configuring
Sessions for Outbound IDoc Mappings” on page 207.
Note: SAP/ALE IDoc Interpreter transformations created in version 7.x do not contain the
Error Output port. As a result, the Integration Service cannot validate outbound IDocs for
SAP/ALE IDoc Interpreter transformations created in earlier versions. You must recreate the
transformation to write invalid outbound IDocs to a relational or flat file target.
Overview
You can configure PowerCenter Connect for SAP NetWeaver mySAP Option to send inbound
SAP IDocs to mySAP applications. To send inbound IDocs, mySAP Option integrates with
mySAP applications using Application Link Enabling (ALE). ALE is an SAP proprietary
technology that enables data communication between SAP systems. ALE also enables data
communication between SAP and external systems.
For example, you have a legacy application that processes sales transactions. You want to
synchronize the transactional data in the legacy application with the data in the SAP
application database. Use an inbound SAP IDoc mapping to send the transactional data from
the legacy application database to the SAP system. The Integration Service extracts the data
from the legacy application data source, prepares the data in SAP IDoc format, and sends the
data to the SAP system as inbound IDocs using ALE.
For more information about SAP, see “IDoc Integration Using ALE” on page 6.
Working with the SAP/ALE IDoc Prepare Transformation
You need to include an SAP/ALE IDoc Prepare transformation in an inbound IDoc mapping.
The transformation receives data from upstream transformations in the mapping and
prepares the segment data in IDoc format.
Each SAP/ALE IDoc Prepare transformation can prepare data for a single IDoc type only.
You can include multiple SAP/ALE IDoc Prepare transformations to represent multiple
IDoc types in a single mapping.
After you create an SAP/ALE IDoc Prepare transformation, you can edit the transformation
to set values for control record segments and to change the data segments you want to include
in the transformation.
When you edit the transformation, you can also view details about the IDoc type and
segments. To view details, double-click the title bar of the transformation and select the IDoc
Display tab. For more information, see Table 9-2 on page 179.
Segment               Field                 Key
CONTROL_INPUT_ABSEN1  GPK_DOCNUM            P1
E2ABSE1               GPK_E2ABSE1           C1
                      GFK_DOCNUM_E2ABSE1    P1
E2ABSE2               GPK_E2ABSE2           C2
                      GFK_DOCNUM_E2ABSE2    P1
                      GFK_E2ABSE2_E2ABSE2A  C2
E2ABSE3               GPK_E2ABSE3           C3
                      GFK_DOCNUM_E2ABSE3    P1
                      GFK_E2ABSE2_E2ABSE2A  C3
E2ABSE4               GPK_E2ABSE4           C4
                      GFK_DOCNUM_E2ABSE4    P1
The IDoc Prepare transformation uses these primary and foreign key relationships to
maintain the structure of the IDoc data. Any foreign key field that does not match the
primary key of its parent row is treated as an orphan row.
1. In the Transformation Developer or Mapping Designer, double-click the title bar of the
SAP/ALE IDoc Prepare transformation.
The Edit Transformations window appears.
(Callouts: segments in black are editable; segments in gray are not editable.)
The IDoc Control Record tab displays the control record segments in the IDoc, their
values, and precision. The Designer provides values for some of the segments when you
create an SAP/ALE IDoc Prepare transformation. You can provide values for other
segments. The Integration Service writes these values to the SAP system during the
session.
You can enter values in the following ways:
♦ Manually enter values for segments.
♦ Connect to the SAP system to get predefined values for required segments.
3. If you do not want to connect to the SAP system to get IDoc control record segment
values, enter values for the segments you want.
You can also enter a mapping variable for any segment. For more information about
mapping variables, see “Mapping Parameters and Variables” in the PowerCenter Designer
Guide.
4. To get predefined values for required control record segments, click Get Partner Profile to
connect to the SAP system.
Tip: If you want to connect to the SAP system to get values for required control record
segments, and you imported the IDoc metadata for the transformation from file, you can
enter a value for MESTYP before clicking Get Partner Profile. When you connect to the
SAP system, SAP provides values for the remaining required control record segments.
Field           Required/Optional  Description
Connect String  Required           Type A or Type B destination entry in the saprfc.ini file.
User Name       Required           SAP source system connection user name. The user name you use to
                                   connect to SAP must be a user for which you have created a source
                                   system connection.
Language        Required           Language you want for the mapping. If you leave this option blank,
                                   PowerCenter uses the default language of the SAP system. The
                                   language you choose must be compatible with the PowerCenter Client
                                   code page. For more information about language and code pages, see
                                   “Language Codes, Code Pages, and Unicode Support” on page 395.
Message Type
7. If you imported IDoc metadata for the SAP/ALE IDoc Prepare transformation or entered
a value for MESTYP on the IDoc Control Record tab, click Select. Go to step 9.
8. If there is no value for the message type on the IDoc Control Record tab, click Next until
you find the appropriate message type.
Figure 10-1. Inbound IDoc Mapping with Invalid IDoc Error Handling
To write invalid inbound IDocs to a relational or flat file target, you must also configure the
inbound IDoc session to check for invalid IDocs. For more information, see “Configuring
Sessions for Inbound IDoc Mappings” on page 212.
Overview
After you create an outbound or inbound mapping in the Designer, you create a session and
use the session in a workflow.
Complete the following tasks before you can run an SAP workflow:
♦ Configure an Integration Service. Configure an Integration Service to run SAP
workflows. For more information about configuring the Integration Service, see “Creating
and Configuring the Integration Service” in the PowerCenter Administrator Guide.
♦ Configure source and target connections. Configure connections to extract data from the
source and write data to the target. For more information about creating an application
connection, see “Managing Connection Objects” in the PowerCenter Workflow
Administration Guide.
You cannot use a connection variable for an SAP_ALE_IDoc_Reader or
SAP_ALE_IDoc_Writer application connection. For more information about connection
variables, see the PowerCenter Workflow Administration Guide.
206 Chapter 11: Configuring Workflows for IDoc Mappings Using ALE
Configuring Sessions for Outbound IDoc Mappings
When you configure a session for an outbound IDoc mapping to receive IDocs from SAP
using ALE, configure the following properties:
♦ Session conditions
♦ Message recovery
♦ Pipeline partitioning
♦ IDoc validation
♦ Continuous workflows
Session Conditions
When you configure a session for an outbound IDoc mapping, you can set session conditions
to define how the Integration Service reads IDocs from the source.
You can define the following session conditions:
♦ Idle Time
♦ Packet Count
♦ Real-time Flush Latency
♦ Reader Time Limit
When you enter values for multiple session conditions, the Integration Service stops reading
IDocs from the SAP source when the first session condition is met. For example, if you set the
Idle Time value to 10 seconds and the Packet Count value to 100 packets, the Integration
Service stops reading IDocs from the SAP source after 10 seconds or after reading 100
packets, whichever comes first.
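The first-condition-met rule can be pictured as a simple check over the configured limits. This sketch is illustrative only, not PowerCenter internals; it covers the three reader limits and omits Real-time Flush Latency, and the disabled-value defaults (-1 and 0) mirror the defaults described later in this chapter:

```python
# Illustrative sketch: the reader stops as soon as ANY configured session
# condition is met. -1 (or 0 for the reader time limit) disables a condition.
def should_stop(elapsed_idle_s, packets_read, elapsed_total_s,
                idle_time=-1, packet_count=-1, reader_time_limit=0):
    """Return True when the first configured limit has been reached."""
    if idle_time != -1 and elapsed_idle_s >= idle_time:
        return True                     # no IDocs arrived for too long
    if packet_count != -1 and packets_read >= packet_count:
        return True                     # read the configured number of packets
    if reader_time_limit != 0 and elapsed_total_s >= reader_time_limit:
        return True                     # read from SAP for the allowed time
    return False
```

With idle_time=10 and packet_count=100, whichever limit is hit first stops the reader, matching the example above.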
Idle Time
Use the Idle Time session condition to indicate the number of seconds the Integration Service
waits for IDocs to arrive before it stops reading from the SAP source. For example, if you
enter 30 for Idle Time, the Integration Service waits 30 seconds after reading from the source.
If no new IDocs arrive within 30 seconds, the Integration Service stops reading from the SAP
source.
Packet Count
Use the Packet Count session condition to control the number of packets the Integration
Service reads from SAP before stopping. For example, if you enter 10 for Packet Count, the
Integration Service reads the first 10 packets from the SAP source and then stops. The Packet
Size property in the ALE configuration determines the number of IDocs the Integration
Service receives in a packet.
If you enter a Packet Count value, and you configure the session to use pipeline partitioning,
the outbound IDoc session can run on a single node only. The Integration Service that runs
the session cannot run on a grid or on primary and backup nodes.
Message Recovery
You can configure message recovery for sessions that fail when reading IDocs from an SAP
source. When you enable message recovery, the Integration Service stores all IDocs it reads
from the SAP source in a cache before processing the IDocs for the target. If the session fails,
run the session in recovery mode to recover the messages the Integration Service could not
process during the previous session run.
During the recovery session, the Integration Service reads the IDocs from the cache. After the
Integration Service reads the IDocs from the cache and processes them for the target, the
recovery session ends.
The Integration Service removes IDocs from the cache after the Real-time Flush Latency
period expires. It also empties the cache at the end of the session. If the session fails after the
Integration Service commits IDocs to the target but before it removes the IDocs from the
cache, targets may receive duplicate rows during the next session run.
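The duplicate-row risk described above comes from the ordering of commit and cache cleanup. The following sketch models that ordering with in-memory lists; it is illustrative only and not how the Integration Service is implemented:

```python
# Illustrative model of message recovery: IDocs are staged in a cache,
# committed to the target, then removed from the cache. If the session
# fails between commit and removal, recovery replays the cached IDocs
# and the target receives duplicates.
cache = []    # recovery cache (persisted on disk in a real system)
target = []   # rows committed to the target

def process(idoc, crash_before_cache_cleanup=False):
    cache.append(idoc)
    target.append(idoc)              # commit the IDoc to the target
    if crash_before_cache_cleanup:
        return                       # session fails here: cache not cleaned
    cache.remove(idoc)               # normal cleanup after commit

def recover():
    """Recovery run: replay whatever is still in the cache."""
    for idoc in list(cache):
        target.append(idoc)          # already-committed IDocs arrive again
        cache.remove(idoc)
```

Running process() with a simulated crash and then recover() leaves the same IDoc in the target twice, which is exactly the scenario the paragraph above warns about.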
Complete the following steps to enable message recovery:
1. Select a recovery strategy in the session properties.
2. Specify a recovery cache folder in the session properties at each partition point.
Note: You also must select Output is Deterministic in the SAP/ALE IDoc Interpreter
transformation to enable recovery. For more information, see “Editing an SAP/ALE IDoc
Interpreter Transformation” on page 189.
If you change the mapping or the session properties and then restart the session, the
Integration Service creates a new cache file and then runs the session. This can result in data
loss during the session run.
For more information about message recovery, see “Real-time Processing” in the PowerCenter
Workflow Administration Guide.
Continuous Workflows
You can schedule a workflow to run continuously. A continuous workflow starts as soon as
the Integration Service initializes. When the workflow stops, it restarts immediately. To
schedule a continuous workflow, select Run Continuously from the Schedule tab in the
scheduler properties when you schedule the workflow. For more information about
scheduling workflows, see “Working with Workflows” in the PowerCenter Workflow
Administration Guide.
Pipeline Partitioning
You can increase the number of partitions in a pipeline to improve session performance.
Increasing the number of partitions enables the Integration Service to create multiple
connections to sources and process partitioned data concurrently.
When you configure an inbound IDoc session to use pipeline partitioning, use key range
partitioning to make sure that all data belonging to an IDoc message is processed in the same
logical partition. Use the port connected to the GPK_DOCNUM port in the SAP/ALE IDoc
Prepare transformation as the partition key.
The transformation where you define the partitioning depends on the type of source
definition the mapping contains. If the mapping contains a relational source definition,
define key range partitioning for the Source Qualifier transformation. If the mapping
contains a flat file source definition, the Source Qualifier transformation does not support key
range partitioning. Therefore, include an Expression
transformation in the inbound IDoc mapping preceding the SAP/ALE IDoc Prepare
transformation. Define key range partitioning for the Expression transformation.
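The reason key range partitioning works here can be shown with a small routing sketch. This is illustrative only; the ranges are invented, and the document number stands in for the port connected to GPK_DOCNUM:

```python
# Illustrative sketch: routing rows by document number keeps every row
# of one IDoc message in the same partition.
def partition_for(docnum, key_ranges):
    """Return the index of the key range that contains docnum."""
    for i, (low, high) in enumerate(key_ranges):
        if low <= docnum <= high:
            return i
    raise ValueError(f"docnum {docnum} is outside every key range")

# Two invented ranges: with these, all rows of IDoc 7000 land in partition 1,
# so the message is never split across partitions.
key_ranges = [(1, 4999), (5000, 9999)]
```

Because the partition is a pure function of the document number, rows that belong to the same IDoc message always take the same path.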
For more information about partitioning and a list of all partitioning restrictions, see
“Understanding Pipeline Partitioning” in the PowerCenter Workflow Administration Guide.
Sending IDocs to SAP
The Integration Service sends IDoc messages to SAP as a packet. By default, SAP allows
packets of 10 MB. The SAP administrator can change this packet size configuration. When
you configure how the Integration Service sends IDocs, ensure that the packet size is equal to
or less than the packet size configured in SAP.
To determine how the Integration Service sends IDocs, select one of the following options for
Send IDocs Based On in the session properties:
♦ Packet Size. The Integration Service sends IDoc messages as a packet based on the value
you set for the Packet Size property.
♦ Commit Call. The Integration Service sends IDoc messages as a packet at every commit
point in the session.
When you send IDocs based on packet size, the Integration Service stores the IDoc messages
in memory until the total count reaches the packet size. It then sends the messages as a packet
to SAP. A larger packet size reduces the number of calls to SAP. However, if the PowerCenter
session fails, the Integration Service must resend a larger amount of data during the next
session run. Calculate the value for the Packet Size session property based on the packet size
configuration in SAP and the maximum number of rows per IDoc message that you expect to
send to SAP.
For example, you configured SAP to handle a packet of 10 MB. One row in an IDoc message
is equal to 1000 bytes. You are sending IDoc messages that contain a maximum of 50 rows.
Set the Packet Size property to 200.
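The arithmetic behind that example can be checked directly (10 MB is taken as 10,000,000 bytes here, consistent with the example's result):

```python
# Worked check of the Packet Size example: SAP accepts 10 MB packets,
# one row is 1,000 bytes, and each IDoc message holds at most 50 rows.
sap_packet_bytes = 10_000_000          # packet size configured in SAP
bytes_per_row = 1_000
max_rows_per_message = 50

max_message_bytes = bytes_per_row * max_rows_per_message   # largest IDoc message
packet_size = sap_packet_bytes // max_message_bytes        # messages per packet
```

This yields a Packet Size of 200, matching the value in the text.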
When you send IDocs based on commit call, the Integration Service commits IDocs to SAP
based on the commit properties you configure in the session. The Integration Service sends
IDoc messages as a packet at every commit point in the session.
To ensure that the commit occurs at the IDoc message boundary, use a user-defined commit.
In a user-defined commit, the Integration Service commits IDocs based on transactions you
define in the mapping properties. If you use a source-based commit, the Integration Service
may send partial IDocs to SAP. For more information about setting commit properties, see
“Understanding Commit Points” in the PowerCenter Workflow Administration Guide.
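The idea of committing only at IDoc message boundaries can be sketched as grouping rows by document number before each commit. This is an illustration of the concept, not the Integration Service's commit logic:

```python
from itertools import groupby

# Illustrative sketch of a user-defined commit at IDoc message boundaries:
# consecutive rows are grouped by document number and committed one whole
# message at a time, so a commit can never split an IDoc.
def commit_batches(rows):
    """Yield lists of rows, one complete IDoc message per batch."""
    for _, message_rows in groupby(rows, key=lambda r: r["docnum"]):
        yield list(message_rows)
```

A source-based commit, by contrast, can fire mid-message, which is why partial IDocs can reach SAP in that mode.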
When the session completes, the Integration Service releases cache memory and deletes the
cache files. You may find cache files in the cache directory if the session does not complete
successfully.
1. In the Task Developer, double-click an SAP session to open the session properties.
2. If you are configuring an outbound IDoc session, select a recovery strategy from the
General Options settings on the Properties tab.
To enable message recovery, select Resume from Last Checkpoint.
If you enable message recovery, you can configure a value for the recovery cache folder
from the Properties settings of the Mapping tab (Sources node). Or, use the default cache
folder $PMCacheDir\.
3. In the Config Object tab, configure advanced settings, log options, and error handling
properties.
4. Click the Mapping tab.
5. From the Connections settings on the Mapping tab (Sources node), select the connection
values for the sources.
Session Condition        Default Value  Description
Idle Time                -1             SAP can remain idle for an infinite period of time before the
                                        PowerCenter session ends.
Packet Count             -1             Integration Service can read an infinite number of packets
                                        before the session ends.
Reader Time Limit        0              Integration Service can read IDocs from SAP for an infinite
                                        period of time.
Real-time Flush Latency  0              Integration Service does not run the session in real time.
8. If you are configuring an inbound IDoc session, click the Properties settings.
Property               Required/Optional  Description
Packet Size            Required           Number of IDocs you want the Integration Service to send in a
                                          packet to SAP. For more information, see “Configuring Sessions
                                          for Inbound IDoc Mappings” on page 212.
Number of Retries      Required           Number of times you want the Integration Service to attempt to
                                          connect to the SAP system.
Delay Between Retries  Required           Number of seconds you want the Integration Service to wait
                                          before attempting to connect to the SAP system if it could not
                                          connect on a previous attempt.
11. Enter the following properties, depending on whether you are configuring a session for
an outbound or inbound IDoc mapping:
Duplicate Parent Row Handling (Required; Outbound and Inbound). Determines how the
Integration Service handles duplicate parent rows during a session. Choose one of the
following values:
- First Row. The Integration Service passes the first duplicate row to the target. The
  Integration Service rejects rows with the same primary key that it processes after this row.
- Last Row. The Integration Service passes the last duplicate row to the target.
- Error. The Integration Service passes the first row to the target. Rows that follow with
  duplicate primary keys increment the error count. The session fails when the error count
  exceeds the error threshold.
For more information about parent rows in IDoc data, see “IDoc Primary and Foreign Keys”
on page 196.
Orphan Row Handling (Required; Outbound and Inbound). Determines how the Integration
Service handles orphan rows during a session. Choose one of the following values:
- Ignore. The Integration Service ignores orphan rows.
- Error. The session fails when the error count exceeds the error threshold.
For more information about orphan rows in IDoc data, see “IDoc Primary and Foreign Keys”
on page 196.
Extended Syntax Check (Required; Outbound and Inbound). Checks for invalid IDocs. For
more information, see “Outbound IDoc Validation” on page 211 or “Inbound IDoc
Validation” on page 213.
NULL Field Representation (Required; Inbound). Determines how the Integration Service
handles fields with a null value when preparing the data in IDoc format. Choose one of the
following values:
- Blank. The Integration Service inserts all blanks for the field.
- Slash (/). The Integration Service inserts a single slash (/) for the field.
Cache Directory (Required; Inbound). Default directory used to cache inbound IDoc or DMI
data. By default, the cache files are created in a directory specified by the variable
$PMCacheDir. If you override the directory, make sure the directory exists and contains
enough disk space for the cache files. The directory can be a mapped or mounted drive.
Cache Size (Required; Inbound). Total memory in bytes allocated to the Integration Service
for the caching of data prepared by SAP/ALE IDoc Prepare or SAP DMI Prepare
transformations. Default is 10 MB.
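The three Duplicate Parent Row Handling choices can be sketched as a small deduplication routine. This is an illustration of the described behavior, not PowerCenter's implementation:

```python
# Illustrative sketch of Duplicate Parent Row Handling: keep one row per
# primary key according to the chosen mode (First Row, Last Row, Error).
def dedupe_parents(rows, mode="First Row"):
    """Return (kept_rows, error_count) for the chosen handling mode."""
    seen = {}
    errors = 0
    for row in rows:
        key = row["pk"]
        if key not in seen:
            seen[key] = row              # first row with this key is kept
        elif mode == "Last Row":
            seen[key] = row              # later duplicate replaces earlier
        elif mode == "Error":
            errors += 1                  # first row kept, duplicate counted
        # "First Row": later duplicates are silently rejected
    return list(seen.values()), errors
```

In Error mode the caller would fail the session once the error count exceeds the error threshold.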
Running Workflows
When the Integration Service writes inbound IDocs to the SAP system, SAP does not report
status details back to PowerCenter. Therefore, if SAP fails to post an IDoc, or if an error
occurs after PowerCenter calls SAP, the PowerCenter session log will not contain the reason
for the error. However, you may be able to access detailed information from within SAP.
If a PowerCenter call to SAP fails, PowerCenter writes the error to the session log. For more
information, see the SAP documentation.
Part IV: Data Integration Using
RFC/BAPI Functions
This part contains the following chapters:
♦ Working with RFC/BAPI Function Mappings, 227
♦ Configuring RFC/BAPI Function Sessions, 263
Chapter 12
Working with RFC/BAPI Function Mappings
Overview
You can process data in SAP directly from PowerCenter Connect for SAP NetWeaver mySAP
Option using RFC/BAPI function mappings. When you use an RFC/BAPI function mapping
to process data in SAP, you do not have to install the ABAP program in the Designer for data
integration. The Integration Service makes the RFC/BAPI function call on SAP during a
workflow and processes the SAP data. Use an RFC/BAPI function mapping to extract data
from SAP, change data in SAP, or load data into an SAP system.
To process data in SAP, create an RFC/BAPI function mapping in the Designer with the
Generate RFC/BAPI Function Mapping Wizard. During mapping generation, you select
RFC/BAPI functions and the function parameters for each RFC/BAPI function you want to
use in the mapping. RFC/BAPI functions use the function parameter values to perform tasks.
During a session, the Integration Service makes RFC/BAPI function calls on SAP using the
function parameter values you pass to the RFC/BAPI functions.
RFC/BAPI functions have the following parameters:
♦ Scalar input parameters. SAP function parameters that require scalar input values. Some
BAPI functions require scalar inputs to perform tasks. Use the scalar input parameters in
an RFC/BAPI function when you want to pass scalar input values to the RFC/BAPI
function.
♦ Scalar output parameters. Scalar output values that a BAPI function returns after
performing a task. Use scalar output parameters in an RFC/BAPI function when you want
the RFC/BAPI function to return data from SAP as scalar outputs to the function call.
♦ Table parameters. SAP structures that have more than one row. Table parameters can be
input, output, or both. Use the table parameters when you want to pass table input values
to an RFC/BAPI function. You can also use table parameters when you want the RFC/
BAPI function to return data from SAP as table outputs to the function call.
♦ Changing parameters. SAP function parameters that can require both input and output
values. RFC/BAPI function mappings do not support changing parameters in RFC/BAPI
function signatures.
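The parameter kinds above can be pictured with a small model of a function signature. The class, the stub call, and all names below are hypothetical illustrations, not an SAP or Informatica API:

```python
from dataclasses import dataclass, field

# Hypothetical model of an RFC/BAPI function signature with scalar input
# parameters and table parameters. The stub "call" simply echoes inputs
# back as scalar and table outputs to show the parameter flow.
@dataclass
class RfcBapiFunction:
    name: str
    scalar_inputs: dict = field(default_factory=dict)   # scalar input parameters
    tables: dict = field(default_factory=dict)          # table parameters

    def call(self):
        """Stub call: return scalar and table outputs derived from the inputs."""
        return {
            "scalar_outputs": {k.upper(): v for k, v in self.scalar_inputs.items()},
            "table_outputs": {t: rows[:] for t, rows in self.tables.items()},
        }
```

A real call would go over RFC to SAP; the point here is only that scalar values go in and scalar or table values come back to the function call.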
RFC/BAPI functions that give you write access to data in SAP may or may not require scalar
or table inputs for the function call. If the RFC/BAPI function you want to use in the
mapping does not require any scalar or table input for the function call, generate a No Input
mapping in the Mapping Designer. If the function requires scalar or table inputs, generate a
Multiple Stream mapping or a Single Stream mapping in the Mapping Designer.
When you generate an RFC/BAPI Multiple Stream or Single Stream mapping to write data
into SAP, supply the data for the scalar and table input parameters for the RFC/BAPI
functions. When the Integration Service runs a session, it uses the scalar and table parameter
input data you provide to make the RFC/BAPI function call on SAP.
RFC/BAPI functions that write data to SAP can also produce scalar or table outputs. When
the Integration Service receives scalar or table outputs from a write RFC/BAPI function call,
it interprets the data and passes it to the next transformations in the mapping.
For more information about No Input mappings, see “Working with RFC/BAPI No Input
Mappings” on page 232. For more information about Multiple Stream mappings, see
“Working with RFC/BAPI Multiple Stream Mappings” on page 236. For more information
about Single Stream mappings, see “Working with RFC/BAPI Single Stream Mappings” on
page 246.
Figure 12-1. Viewing RFC/BAPI Function Metadata from the Mapplet Input Output Ports
Working with RFC/BAPI No Input Mappings
Use an RFC/BAPI No Input function mapping to extract data from SAP. When you generate
an RFC/BAPI No Input function mapping, select RFC/BAPI function signatures that do not
require any scalar or table input to perform tasks. For example, you want to get a list of
company codes from SAP. Use the BAPI function BAPI_COMPANYCODE_GETLIST in
the mapping. The BAPI_COMPANYCODE_GETLIST function signature does not require
any scalar or table input from the user. When the Integration Service makes the
BAPI_COMPANYCODE_GETLIST function call, SAP returns a list of company codes in
the SAP system.
To create an RFC/BAPI No Input mapping, use the Generate RFC/BAPI Function Mapping
Wizard in the Designer. During mapping generation, the wizard prompts you to select the
RFC/BAPI functions you want to use in the mapping. For RFC/BAPI No Input mappings,
select No Input as the mapping generation context. The wizard creates a mapping based on
the selections.
Table 12-1 lists the transformations in an RFC/BAPI No Input mapping:
♦ Source definition. Mapping includes a Virtual plug-in source definition for each RFC/BAPI signature you use in the mapping. For more information, see “Source Definition in an RFC/BAPI No Input Mapping” on page 233.
♦ Source Qualifier. Mapping includes an Application Multi-Group Source Qualifier transformation for each Virtual plug-in source definition in the mapping. For more information, see “Source Qualifier in an RFC/BAPI No Input Mapping” on page 233.
♦ FunctionCall mapplet. Mapping includes a FunctionCall_No_Input mapplet for each RFC/BAPI function signature you use in the mapping. In the No Input RFC/BAPI mapping, the FunctionCall_No_Input mapplet makes the RFC/BAPI function call on SAP and interprets the data that the function call returns. For more information, see “FunctionCall_No_Input Mapplet in an RFC/BAPI No Input Mapping” on page 233.
♦ Target definition for function output. Mapping includes a Virtual plug-in target definition for each scalar or table output parameter you use in the RFC/BAPI function. The wizard includes the Virtual plug-in target in the mapping for mapping validation. Replace the Virtual plug-in target definitions with appropriate targets for the target warehouse. For more information, see “Target Definition in a No Input Mapping” on page 235.
When you generate an RFC/BAPI No Input mapping, the mapping contains a separate
pipeline for each RFC/BAPI function signature you use in the mapping.
Since an RFC/BAPI No Input mapping does not require any scalar or table input, the
FunctionCall_No_Input mapplet in the mapping does not receive any input from a source to
make the RFC/BAPI function call. The mapplet connects to the Virtual plug-in source
definition in the mapping for mapping validation. For more information about the Virtual
plug-in source definition in an RFC/BAPI No Input mapping, see “Source Definition in an
RFC/BAPI No Input Mapping” on page 233.
The FunctionCall_No_Input mapplet contains a group for all scalar output parameters for
the function signature you select during mapping generation. It also contains a group for each
table output parameter you select during mapping generation. Each output group in the
mapplet contains output ports for the scalar or table output fields.
For example, you want to create a mapping for BAPI_COMPANYCODE_GETLIST, and
you specify COMPANYCODE_LIST as the table output parameter for the BAPI function
call. The COMPANYCODE_LIST table parameter contains the fields COMP_CODE and
COMP_NAME. When you generate the mapping, the SAP RFC/BAPI Function Mapping
Wizard creates a group for COMPANYCODE_LIST in the mapplet. The group contains
output ports for the fields COMP_CODE and COMP_NAME.
When you run a session for an RFC/BAPI No Input mapping, the FunctionCall_No_Input
mapplet makes the RFC/BAPI function call on SAP. When SAP returns scalar or table output
to the function call, the FunctionCall_No_Input mapplet receives and interprets the output
data. The mapplet then passes the interpreted data to the target in the mapping.
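To picture this flow, the following sketch mimics what the FunctionCall_No_Input mapplet does conceptually: it calls a parameterless function and fans the returned COMPANYCODE_LIST table out as one row per company code. The stub stands in for a real SAP RFC client; the sample data is invented, and none of this is PowerCenter or SAP code.

```python
def call_bapi_companycode_getlist():
    # Stub for BAPI_COMPANYCODE_GETLIST: the call takes no scalar or
    # table input. The returned rows are invented sample data.
    return {
        "COMPANYCODE_LIST": [  # table output parameter
            {"COMP_CODE": "1000", "COMP_NAME": "IDES AG"},
            {"COMP_CODE": "2000", "COMP_NAME": "IDES UK"},
        ]
    }

def rows_for_target(result):
    # One target row per entry in the COMPANYCODE_LIST output group,
    # mirroring the output ports for COMP_CODE and COMP_NAME.
    return [(r["COMP_CODE"], r["COMP_NAME"]) for r in result["COMPANYCODE_LIST"]]

rows = rows_for_target(call_bapi_companycode_getlist())
```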
An RFC/BAPI multiple stream mapping contains the following transformations:
♦ Source definition for function input data. Mapping includes Virtual plug-in source definitions for the scalar and table input parameters of an RFC/BAPI function signature. For more information, see “Source Definition in an RFC/BAPI Multiple Stream Mapping” on page 238.
♦ Source Qualifier. Mapping includes an Application Multi-Group Source Qualifier transformation for each Virtual plug-in source definition in the mapping. For more information, see “Source Qualifier for an RFC/BAPI Multiple Stream Mapping” on page 239.
♦ Prepare mapplet. Mapping includes a Prepare mapplet for each RFC/BAPI function signature you use in the mapping. For more information, see “Prepare Mapplet in an RFC/BAPI Multiple Stream Mapping” on page 240.
♦ RFCPreparedData target definition. Mapping contains a set of flat file target definitions that receive input from the Prepare mapplets in the mapping. Each Prepare mapplet in the mapping passes data to a corresponding RFCPreparedData (flat file) target definition. The RFCPreparedData target definitions represent staging files where the Integration Service writes the prepared data from the Prepare mapplets. For more information, see “RFCPreparedData Target Definition in an RFC/BAPI Multiple Stream Mapping” on page 241.
♦ RFCPreparedData source definition. Mapping contains an RFCPreparedData (flat file) source definition. It represents the RFCPreparedData indirect file, which the Integration Service uses to read data from the RFCPreparedData staging files. For more information, see “RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping” on page 241.
♦ Source Qualifier for RFCPreparedData source. Mapping includes a Source Qualifier transformation for the RFCPreparedData source definition in the mapping. For more information, see “Source Qualifier for RFCPreparedData Source Definition in an RFC/BAPI Multiple Stream Mapping” on page 242.
♦ FunctionCall_MS mapplet. Mapping includes a FunctionCall_MS mapplet for each RFC/BAPI function signature you use in the mapping. The Multiple Stream FunctionCall_MS mapplet makes the RFC/BAPI function call to extract data from SAP. It also interprets the data that the function call returns. For more information, see “FunctionCall_MS Mapplet in an RFC/BAPI Multiple Stream Mapping” on page 243.
♦ Target definition for function output. Mapping includes a Virtual plug-in target definition for each scalar or table output parameter you use for the RFC/BAPI functions. The wizard includes the Virtual plug-in target definitions in the mapping for mapping validation. Replace the Virtual plug-in target definitions with the appropriate definitions for the target warehouse. For more information, see “Target Definition in an RFC/BAPI Multiple Stream Mapping” on page 244.
When you generate an RFC/BAPI multiple stream mapping, the mapping contains a prepare
pipeline for each input parameter in the RFC/BAPI function signatures. The prepare pipeline
receives data from the source and flattens the data for each input parameter to a single field
called PreparedData. It then writes the PreparedData to the RFCPreparedData staging file
target.
The mapping also contains a function call pipeline that reads data from the
RFCPreparedData staging files and then uses the data as function input to make the RFC/
BAPI function calls on SAP. If the mapping contains RFC/BAPI functions that return
function outputs from SAP, the function call pipeline also writes the RFC/BAPI function call
outputs to the target definitions in the mapping.
When the Prepare mapplet receives data from the source definitions in the mapping, it
flattens the input data for each function parameter into the PreparedData output port.
During the session, the Prepare mapplet also generates a Sequence ID for the RFC/BAPI
function. The SequenceID determines the order in which the Integration Service makes the
RFC/BAPI function call on SAP in the function call pipeline. The Prepare mapplet generates
the SequenceID for the function based on the order in which you select the function during
mapping generation. For more information about selecting RFC/BAPI functions for mapping
generation, see “Generating SAP RFC/BAPI Mappings” on page 255.
The mapplet output ports in the Prepare mapplet pass the PreparedData, TransactionID,
SequenceID, FunctionName, IntegrationID, and ParameterName for each parameter to the
RFCPreparedData target definition.
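A minimal sketch of such a staging record follows, assuming a simple semicolon-delimited flattening; the actual flattening format the Prepare mapplet uses is not documented here, and the port names come from the paragraph above.

```python
def prepare_record(transaction_id, sequence_id, function_name,
                   integration_id, parameter_name, values):
    # Flatten all input values for one parameter into a single
    # PreparedData field. The ";" delimiter is an assumption.
    prepared_data = ";".join(str(v) for v in values)
    return {
        "PreparedData": prepared_data,
        "TransactionID": transaction_id,
        "SequenceID": sequence_id,      # controls the RFC/BAPI call order
        "FunctionName": function_name,
        "IntegrationID": integration_id,
        "ParameterName": parameter_name,
    }

rec = prepare_record("T1", 1, "BAPI_PLANNEDORDER_CREATE", "I1",
                     "HEADERDATA", ["MAT-001", "0001", "100"])
```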
The RFCPreparedData target definition uses 5000 characters as the precision for the
PreparedData port. If the RFC/BAPI function you use in the mapping requires the
PreparedData precision to be greater than 5000 characters, the Integration Service fails the
session and logs an error in the session log. The error message in the session log indicates the
required precision for the PreparedData port in the RFCPreparedData target. To correct the
error, replace the RFCPreparedData target definition in the mapping with a target definition
that has the required precision for the PreparedData port to support the RFC/BAPI function
you are using.
For example, you can copy the RFCPreparedData target definition in the Designer. Save the
copy of the RFCPreparedData target definition with a different name. For example, you can
name the copy RFCPreparedData_Copy. Modify the precision of the PreparedData port in
the RFCPreparedData_Copy target definition. Replace the RFCPreparedData target
definition with the RFCPreparedData_Copy target definition in the mapping. For
information about copying objects, see the PowerCenter Designer Guide.
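The precision rule can be checked ahead of time. The sketch below (not a PowerCenter API) reports the precision a replacement target definition would need when a flattened value exceeds the 5000-character default:

```python
DEFAULT_PRECISION = 5000

def check_prepared_precision(prepared_values, precision=DEFAULT_PRECISION):
    # Return the longest PreparedData length, or raise when it exceeds
    # the precision of the RFCPreparedData target's PreparedData port.
    longest = max((len(v) for v in prepared_values), default=0)
    if longest > precision:
        raise ValueError(
            f"PreparedData requires precision {longest}; "
            f"target allows only {precision}"
        )
    return longest
```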
When you configure a session for an RFC/BAPI multiple stream mapping, select Indirect as
the source filetype value for the RFCPreparedData source. During a session for an RFC/BAPI
multiple stream mapping, the Integration Service uses the file list to read data from the
RFCPreparedData staging files and passes the data to the RFCTransaction_Support mapplet
in the function call pipeline.
Note: If you select Direct as the source file type for the RFCPreparedData source during
session configuration, the RFC/BAPI function call fails.
Figure 12-10. Transaction_Support Mapplet in the Function Call Pipeline in a Multiple Stream Mapping
The mapplet receives input from the RFCPreparedData source in the function call pipeline. The Transaction_Support mapplet routes data to the individual FunctionCall_MS mapplets for each RFC/BAPI function you use in the mapping. The FunctionCall_MS mapplet passes table outputs from the function call to the target.
Note: If an RFC/BAPI function call returns 99999999 as the value for a DATS field, the
transformations in the RFC/BAPI mapping pass the date value as 12/31/9999 to the output.
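The DATS handling in the note can be expressed as a small conversion, assuming the usual YYYYMMDD layout of SAP DATS values (the parsing itself is an assumption of this sketch):

```python
from datetime import date

def dats_to_date(dats):
    # SAP returns 99999999 as a "maximum date" marker; the mapping
    # passes it through as 12/31/9999. Other values parse as YYYYMMDD.
    if dats == "99999999":
        return date(9999, 12, 31)
    return date(int(dats[:4]), int(dats[4:6]), int(dats[6:8]))
```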
An RFC/BAPI single stream mapping contains the following transformations:
♦ Source definition for function input data. Mapping includes a Virtual plug-in source definition for the scalar and table input parameters of each RFC/BAPI function signature you use in the mapping. Replace the Virtual plug-in source definitions with source definitions that pass data to the scalar and table input parameters of the RFC/BAPI functions in RFC format. For more information, see “Source Definition in an RFC/BAPI Single Stream Mapping” on page 247.
♦ Source Qualifier. Mapping includes an Application Multi-Group Source Qualifier transformation for each Virtual plug-in source definition in the mapping. For more information, see “Source Qualifier in an RFC/BAPI Single Stream Mapping” on page 250.
♦ Sorter transformation. Mapping includes a Sorter transformation to sort source data by TransactionID and IntegrationID. It passes the sorted data to the FunctionCall_SS mapplet. For more information, see “Sorter Transformation in an RFC/BAPI Single Stream Mapping” on page 251.
♦ FunctionCall_SS mapplet. Mapping includes a FunctionCall_SS mapplet for each RFC/BAPI function signature you use in the mapping. The mapplet makes the RFC/BAPI function call on SAP. The mapplet also makes commit calls on SAP for transaction control. If the RFC/BAPI function call returns output, the mapplet interprets the data that the function call returns and passes it to the targets. For more information, see “FunctionCall_SS Mapplet in an RFC/BAPI Single Stream Mapping” on page 251.
♦ Target definition. Mapping includes a Virtual plug-in target definition for each scalar or table output parameter you use for the RFC/BAPI functions. The wizard includes the Virtual plug-in target definitions in the mapping for mapping validation. Replace the Virtual plug-in target definitions with the appropriate definitions for the target warehouse. For more information, see “Target Definition in an RFC/BAPI Single Stream Mapping” on page 252.
The HEADERDATA scalar input parameter contains the following fields: PLANNEDORDER, MATERIAL, PLANT, PRODQTY, ORDERSTARTDATE, ORDERENDDATE, and MANUALCOMP. The COMPONENTSDATA table input parameter contains the fields COMPDATAPLANT, COMPDATAMATERIAL, and ENTRYQTY.
Create a flat file source definition that passes the scalar and table inputs to the
FunctionCall_SS mapplet for the BAPI_PLANNEDORDER_CREATE function. Include
KEY fields for IntegrationID and TransactionID in the flat file source definition. Use an
indicator field for Scalar Input since you want to pass scalar inputs to the function call.
Include an indicator field for the table input parameter COMPONENTSDATA. Then
include fields for each scalar input and table input parameter fields for the function.
Table 12-4 lists the fields you can define in the source definition to pass the scalar and table
inputs for the function BAPI_PLANNEDORDER_CREATE in a single stream mapping:
Table 12-4. Preparing Source Data for RFC/BAPI Single Stream Mapping
♦ TransactionID. KEY field that identifies a set of data for a single logical unit of work. The FunctionCall_SS mapplet makes a commit call on SAP when it receives a complete set of records with the same TransactionID.
♦ IntegrationID. KEY field that uniquely identifies a set of data for an RFC/BAPI function call. The FunctionCall_SS mapplet makes the RFC/BAPI function call on SAP when it receives a complete set of records with the same IntegrationID.
♦ Scalar input indicator field. Indicator field for scalar inputs to the RFC/BAPI function. Include this field in the source definition when you want to pass scalar inputs to an RFC/BAPI function call. You can supply any value for this field to indicate that the source contains data for scalar inputs to the RFC/BAPI function. If you pass scalar inputs to an RFC/BAPI function and you do not supply a value for the scalar input indicator field, the RFC/BAPI function call may return an error from SAP.
♦ One or more table input indicator fields. Include an indicator field for each table input parameter you use in the mapping. In this example, the source contains a field for the table input parameter COMPONENTSDATA. You can supply any value for this field to indicate that the source contains data for the table input parameter COMPONENTSDATA. If you pass table inputs to an RFC/BAPI function and you do not supply a value for the corresponding table input indicator field, the RFC/BAPI function call may return an error from SAP.
♦ One or more scalar input fields. Include a field for each scalar input field you want to use for the scalar input parameter. In this example, the source contains the following fields for the scalar input parameter HEADERDATA: PLANNEDORDER, MATERIAL, PLANT, PRODQTY, ORDERSTARTDATE, ORDERENDDATE, and MANUALCOMP.
♦ One or more table input fields. Include a field for each table input field for the table input parameter. In this example, the source contains the following fields for the table input parameter COMPONENTSDATA: COMPDATAPLANT, COMPDATAMATERIAL, and ENTRYQTY.
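The TransactionID and IntegrationID semantics can be sketched as grouping logic: one function call per complete IntegrationID set, one commit call per complete TransactionID set. This illustrates the described behavior, not the mapplet's actual implementation, and it assumes rows arrive pre-sorted (the Sorter transformation's job):

```python
from itertools import groupby

def process(rows, make_call, commit):
    # rows: (transaction_id, integration_id, payload) tuples, already
    # sorted by TransactionID and then IntegrationID.
    for txn_id, txn_rows in groupby(rows, key=lambda r: r[0]):
        for int_id, call_rows in groupby(txn_rows, key=lambda r: r[1]):
            make_call(int_id, [r[2] for r in call_rows])  # RFC/BAPI call
        commit(txn_id)  # commit call on SAP per logical unit of work
```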
For example, if you use a comma-delimited flat file source to pass the scalar and table inputs
to the FunctionCall_SS mapplet, supply a value for each field in the source, delimited by
commas. If a row does not pass a value for a scalar or table input parameter field, leave that
field empty between the comma delimiters. For example, if you do not want to supply values
for the table input parameter fields COMPDATAPLANT and COMPDATAMATERIAL for
a number of records, the flat file source must contain no values for the COMPDATAPLANT
and COMPDATAMATERIAL fields for those records.
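For illustration, the following sketch builds two comma-delimited rows in this layout; the column order and indicator column names are assumptions for the example, not the wizard's output:

```python
import csv
import io

header = ["TransactionID", "IntegrationID", "ScalarInd", "CompDataInd",
          "MATERIAL", "PLANT", "COMPDATAPLANT", "COMPDATAMATERIAL", "ENTRYQTY"]
rows = [
    # Scalar-only row: table input fields stay empty between delimiters.
    ["T1", "I1", "X", "", "MAT-1", "0001", "", "", ""],
    # Table-input row: scalar fields stay empty instead.
    ["T1", "I1", "", "X", "", "", "0001", "COMP-1", "10"],
]
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(rows)
```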
After you create the source data for the RFC/BAPI scalar and table input parameters, import
a flat file source definition in the Designer from the flat file source.
Figure 12-14 shows the source definition for BAPI_PLANNEDORDER_CREATE in a single
stream mapping:
Figure 12-14. Flat File Source Definition for BAPI_PLANNEDORDER_CREATE Single Stream Mapping
For more information about importing a flat file source definition, see “Working with Flat
Files” in the PowerCenter Designer Guide.
When you replace the Virtual plug-in source definition and the Application Multi-Group
Source Qualifier in the mapping with the appropriate source definition and Source Qualifier
to pass the RFC/BAPI function inputs to the FunctionCall_SS mapplet, connect the
TransactionID and IntegrationID ports to the corresponding ports in the Sorter
transformation. If you pass scalar and table input data to the FunctionCall_SS mapplet, link
the scalar input and table input indicator ports to the corresponding ports in the Sorter
transformation. For more information about the Sorter transformation, see “Sorter
Transformation” in the PowerCenter Transformation Guide.
Figure 12-17. Return Structure Tab on the Generate SAP RFC/BAPI Mapping Wizard
The Return Structure tab displays the custom return structure, the status indicator field, and the status indicators for warning, error, or abort.
♦ Connect String. Required. Type A or Type B destination entry in the saprfc.ini file.
♦ User Name. Required. SAP source system connection user name. The user name must be a user for which you have created a source system connection.
♦ Language. Required. Language you want for the mapping. The language must be compatible with the PowerCenter Client code page. For more information about language and code pages, see “Language Codes, Code Pages, and Unicode Support” on page 395. If you leave this option blank, PowerCenter uses the default language of the SAP system.
Note: The Generate RFC/BAPI Mapping Wizard only displays RFC functions. It does
not display non-RFC functions.
When you generate an RFC/BAPI mapping to process data in SAP, you can select RFC/
BAPI function signatures from only one SAP system to use in the same mapping. Do not
select RFC/BAPI function signatures from multiple SAP systems to use in the same
mapping.
5. Select the BAPI functions you want to use in the mapping and click Next.
Note: Do not select functions that invoke the SAP GUI. Otherwise, the session fails.
Use the arrows to order the sequence of the RFC/BAPI function calls in the mapping.
The Integration Service makes the RFC/BAPI function calls on SAP based on the
sequence you define during mapping generation.
Note: If you define the sequence of the RFC/BAPI function calls for the mapping and use
the Back button to return to the previous screen, the sequence of RFC/BAPI function
calls you define for the mapping is lost. Redefine the sequence of the RFC/BAPI function
calls.
6. Define the following parameters for each RFC/BAPI function in the mapping:
♦ Table
♦ Return Structure
Note: The Generate SAP RFC/BAPI Mapping Wizard selects the scalar input and output
parameters required by the functions you select. RFC/BAPI functions do not support
changing parameters.
7. Select one of the following mapping generation contexts:
♦ No Input. Select if the BAPI functions you are using do not require any scalar or table input. For more information about RFC/BAPI No Input mappings, see “Working with RFC/BAPI No Input Mappings” on page 232.
♦ Single Stream Input. Select if you supply the prepared data for the scalar and table inputs to the RFC/BAPI function calls. A Single Stream mapping does not contain a prepare pipeline. For more information about Single Stream mappings, see “Working with RFC/BAPI Single Stream Mappings” on page 246.
♦ Multiple Stream Input. Select if you want the mapping to prepare data for the scalar and table inputs to the RFC/BAPI function calls. A Multiple Stream mapping contains a prepare pipeline for the scalar input parameters. It also contains a prepare pipeline for each table input parameter. For more information about RFC/BAPI Multiple Stream mappings, see “Working with RFC/BAPI Multiple Stream Mappings” on page 236.
8. Click Next.
If the Designer does not detect predefined return structures for the functions you want to
use in the mapping, you can define custom return structures for the functions. Click the
Return Structure tab.
9. Optionally, define the following custom return structure parameters for the functions
and click Next:
♦ Return Structure. Select a structure from the list to designate as the return structure for the function.
♦ Text Field. Select a field from the structure for text messages.
♦ Status Indicator for Warning. Enter an indicator message for warning. By default, the wizard uses W for warning.
♦ Status Indicator for Error. Enter an indicator message for error. By default, the wizard uses E for error.
♦ Status Indicator for Abort. Enter an indicator message for abort. By default, the wizard uses A for abort.
♦ Mapping Name. Required. Enter a name for the mapping. Do not use special characters in the mapping name. For a list of special characters that you cannot use in the Designer, see the PowerCenter Designer Guide.
♦ Mapping Parameter: RFC/BAPI Connection. Required. By default, the wizard provides a mapping parameter name as the RFC/BAPI connection name. You can enter a different mapping parameter name for the connection name. For more information about using mapping parameters, see “Mapping Parameters and Variables” in the PowerCenter Designer Guide.
♦ Mapping Parameter: RFC/BAPI Connection Value. Required. Enter the application connection name that you use for the session in the Workflow Manager. Use an application connection of type SAP RFC/BAPI Interface. For more information about configuring an SAP RFC/BAPI Interface application connection in the Workflow Manager, see “Managing Connection Objects” in the PowerCenter Workflow Administration Guide.
♦ Mapping Parameter: Continue Session on Error Value. Required. Select Yes or No as the value for the $$CONTINUE_ON_ERROR mapping parameter. Default is Yes. When you have more than one RFC/BAPI function in a multiple stream mapping, the wizard sets the parameter value to No.
  When you select Yes, the Integration Service logs error messages in the session log when an RFC/BAPI function returns an error during the session. It then continues to run the session to complete the remaining function calls. However, if the RFC/BAPI function returns an ABORT status, the Integration Service cannot continue the session, and the session stops.
  When you select No, the Integration Service stops the session at the first error returned from the RFC/BAPI function call. You can update the value of the $$CONTINUE_ON_ERROR mapping parameter using the parameter file. If you update the parameter file and do not specify a value for the $$CONTINUE_ON_ERROR mapping parameter, the Integration Service stops the session. For more information about using mapping parameters, see “Mapping Parameters and Variables” in the PowerCenter Designer Guide.
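The $$CONTINUE_ON_ERROR behavior described above can be sketched as a control loop; the status strings and call shape are illustrative, not PowerCenter internals:

```python
def run_calls(calls, continue_on_error, log):
    # calls: (name, callable) pairs; each callable returns a status
    # string "OK", "ERROR", or "ABORT". Returns False when the
    # session stops early.
    for name, call in calls:
        status = call()
        if status == "ABORT":
            log(f"{name}: abort returned, stopping session")
            return False
        if status == "ERROR":
            log(f"{name}: error returned")
            if not continue_on_error:  # value No: stop at first error
                return False
    return True
```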
Configuring RFC/BAPI Function Sessions
This chapter includes the following topics:
♦ Overview, 264
♦ Configuring a Session for SAP RFC/BAPI Function Mappings, 265
♦ Error Handling for RFC/BAPI Sessions, 269
Overview
After you create a mapping in the Designer, you create a session and use the session in a
workflow.
Complete the following tasks before you can run an SAP workflow:
♦ Configure an Integration Service. Configure an Integration Service to run SAP
workflows. For more information about configuring an Integration Service, see “Creating
and Configuring the Integration Service” in the PowerCenter Administrator Guide.
♦ Configure source and target connections. Configure connections to extract data from the
source and write data to the target. For example, to process data in SAP, create an SAP R/3
application connection within the Workflow Manager. For more information about
creating an application connection, see “Managing Connection Objects” in the
PowerCenter Workflow Administration Guide.
1. In the Task Developer, double-click an SAP session to open the session properties.
The Edit Tasks dialog box appears.
For more information about setting source file type for a multiple stream RFC/BAPI
function mapping, see “Configuring a Session for SAP RFC/BAPI Function Mappings”
on page 265.
4. On the Targets node, enter the connection values for the targets in the mapping.
Chapter 14
Overview
You can migrate data from legacy applications, other ERP systems, or any number of other
sources into mySAP applications. Create an SAP Data Migration Interface (DMI) mapping to
prepare data for migration into mySAP applications. After you create a DMI mapping, you
can configure a session for the mapping. During the session, the Integration Service extracts
the data from the data source and prepares the data in an SAP format flat file that you can
load into SAP. For more information about configuring sessions for DMI mappings, see
“Configuring Sessions for DMI Mappings” on page 284.
CONTROL_INPUT_ABSEN1: GPK_DOCNUM (P1)
E2ABSE1: GPK_E2ABSE1 (C1), GFK_DOCNUM_E2ABSE1 (P1)
E2ABSE2: GPK_E2ABSE2 (C2), GFK_DOCNUM_E2ABSE2 (P1), GFK_E2ABSE2_E2ABSE2A (C2)
E2ABSE3: GPK_E2ABSE3 (C3), GFK_DOCNUM_E2ABSE3 (P1), GFK_E2ABSE2_E2ABSE2A (C3)
E2ABSE4: GPK_E2ABSE4 (C4), GFK_DOCNUM_E2ABSE4 (P1)
The SAP DMI Prepare transformation uses these primary and foreign key relationships to
maintain the structure of the DMI data. Any foreign key field that does not match the
Figure 14-1. SAP DMI Prepare Transformation Button on the Transformations Toolbar
The button creates an SAP DMI Prepare transformation.
1. In the Transformation Developer, click the SAP DMI Prepare transformation button.
The pointer becomes a cross.
2. Click in the Transformation Developer workspace.
The Generate SAP DMI Transformation Wizard appears.
♦ Connect String. Required. Type A or Type B destination entry in the saprfc.ini file.
♦ User Name. Required. SAP source system connection user name. The user name must be a user for which you have created a source system connection.
♦ Language. Required. Language you want for the mapping. The language must be compatible with the PowerCenter Client code page. For more information about language and code pages, see “Language Codes, Code Pages, and Unicode Support” on page 395. If you leave this option blank, PowerCenter uses the default language of the SAP system.
7. Click Connect.
8. Expand a Data Transfer Object, select an Activity, and click Next.
Step 2 of the wizard appears.
The wizard displays an option to show the Group Status column and lets you select the segments to include in the transformation.
1. In the Transformation Developer or Mapping Designer, double-click the title bar of the
SAP DMI Prepare transformation.
The Edit Transformations dialog box appears.
The DMI Display tab displays options to show transformation metadata and the Group Status column, and lets you select the segments to include in the transformation.
3. Optionally, modify the segments you want to include in the SAP DMI Prepare
transformation.
You can modify optional segments. Select only those segments for which you have source
data.
Click one of the following options to select DMI segments:
♦ Select All Segments. Click to include all segments.
♦ Clear All Segments. Click to remove all selected segments except required segments.
To display which groups are required, click Show Group Status. For more information
about the DMI details that display in the DMI Display tab, see Table 9-2 on page 179.
When you select the segments to include in the transformation, the transformation uses
the following rules to select parent and child segments:
♦ If you select a segment, all its parent segments are selected, and all its required child
segments are selected.
♦ If you clear a segment, all its child segments are cleared.
4. Click OK.
Chapter 15
Overview
PowerCenter Connect for SAP NetWeaver mySAP Option integrates with SAP business
content to provide an efficient, high-volume data warehousing solution. SAP business content
is a collection of metadata objects that can be integrated with other applications and used for
analysis and reporting. mySAP applications produce the business content data, and
PowerCenter consumes it. PowerCenter can consume all or changed business content data
from mySAP applications and write it to a target data warehouse.
Informatica provides an XML file that you can use to import mappings and workflows that
integrate with SAP business content. For more information, see “Step 2. Import PowerCenter
Objects” on page 295.
DataSources
PowerCenter consumes business content data from SAP DataSources. A DataSource is a
customized business object used to retrieve data from internal SAP sources, including SAP
function modules like SAP Financials. It contains the following components:
♦ An extraction structure that describes the fields containing extracted data
♦ An extraction type
♦ An extraction method that transfers data to an internal table of the same type as the
extraction structure
SAP provides the following types of DataSources:
♦ Master data attributes
♦ Master data text
♦ Hierarchy
♦ Transaction
Use standard SAP DataSources or custom DataSources. SAP predefines all standard
DataSources. You must create custom DataSources yourself. For more information about creating
custom DataSources in SAP, see the SAP documentation. For more information about
hierarchies, see “Hierarchy Definitions” on page 68.
Send Request: flat file source to SAP/ALEIDoc target. The source is the request file created when you create the processing mapping.
Workflows for Business Content Integration
You need to import several PowerCenter workflows from BCI_Mappings.xml. You use the
imported listener workflow and the send request and processing workflows that you create to
integrate with SAP business content.
Use the following workflows to integrate with SAP business content:
1. Listener workflow. Receives DataSource data from SAP, stages it for the processing
workflow, and requests that the Integration Service start the appropriate processing and
send request workflows for the DataSource. You configure and use the imported listener
workflow.
2. Send request workflow. Sends requests for DataSource data to SAP. You create one send
request workflow for each DataSource.
3. Processing workflow. Processes DataSource data staged by the listener workflow, writes
data to the target, and cleans up the data staged by the listener workflow. You create one
processing workflow for each processing mapping.
Figure 15-1 shows how the send request and processing workflows interact with the listener
workflow and SAP:
The figure shows the numbered flow of requests and data among SAP, the listener workflow, the send request workflow, and the processing workflow. The processing workflow contains a processing session, which reads from the staging area, and a cleanup session.
Step 1. Prepare DataSources in SAP
Before you create a processing mapping, activate DataSources in SAP. You can also customize
whether DataSource fields are visible. Use the following SAP transactions to activate
DataSources and customize DataSource fields in SAP:
♦ RSA5 Transfer DataSources function. Changes the status of DataSources from Delivered
to Active.
♦ RSA6 Postprocess DataSources and Hierarchy. Customizes the visibility of fields in a
DataSource.
1. In SAP, open transaction RSA5 for the DataSource for which you want to create a
processing mapping.
2. Execute the Transfer DataSources function.
The DataSource is now active. For more information, see the SAP documentation.
1. In SAP, open transaction RSA6 for the DataSource that you want to customize.
2. Select a DataSource and click DataSource > Display DataSource.
3. Select Hide Field for fields you want to hide.
When you create a processing mapping, you will not be able to see hidden fields and
cannot consume data from them.
4. Clear Hide Field for fields you want to make visible.
5. Click the Save button.
For more information, see the SAP documentation.
1. In the Target Designer, add the Source_For_BCI target definition to the workspace.
2. Edit the Source_For_BCI target definition.
3. On the Table tab, verify that the database type matches the relational database, and click
OK.
4. Select the Source_For_BCI target definition, and click Targets > Generate/Execute SQL.
5. Click Connect.
6. Select the ODBC connection, enter the user name and password, and click Connect.
7. Select Create Table, and clear Primary Key and Foreign Key.
Note: The IDocRecord column must be the primary key in the Designer but not in the
Source_For_BCI relational table in the database. A primary or foreign key in the
Source_For_BCI table in the database causes the cleanup session to fail.
8. Click Generate and execute SQL.
The Designer creates the database table using the default table name Source_For_BCI.
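The key rule in step 7 matters because the cleanup session deletes staged rows. As a hedged illustration (using SQLite rather than your actual relational database, and with a placeholder second column), the staging table is created without declaring any key:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Source_For_BCI (
        IDocRecord  TEXT,  -- a key in the Designer only: no PRIMARY KEY here
        IDocData    TEXT   -- placeholder column; the real columns differ
    )
""")
conn.commit()
```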
♦ Code Page*. Code page compatible with the SAP server. Must also correspond to the Language Code.
♦ Domain Name. Name of the domain for the associated Integration Service.
6. Click OK.
To verify the basic IDoc type in the Router transformation of the listener mapping:
4. If the basic IDoc type in the SAP system is not ZSIK1000, modify the group filter
condition to match the basic IDoc type of the SAP system.
The listener mapping contains the BCI_Scheduling_Target, RSINFOStaging, Indicator, and Source_For_BCI targets.
1. In the Workflow Designer, drag the listener workflow into the workspace.
2. Open the session properties for s_BCI_listener.
Update Modes
When you create a mapping, specify one of the following update modes:
♦ Full. Extracts all data that meets the selection parameters.
♦ Delta. Extracts only data that has changed since the last delta update. However, the first
delta update extracts all data.
A series of delta updates is called a delta queue. When you create a processing mapping and
select delta update instead of full update, select one of the following delta update options:
♦ Delta initialization with transfer. Creates a delta queue.
♦ Delta update. Updates existing delta queues created with the delta initialization option.
♦ Delta repeat. Repeats a previous delta update if there were errors.
When you create the processing mapping, you can initialize more than one delta queue for
each DataSource, depending on your business needs. Use multiple delta queues when you
want to select data from a non-continuous range that cannot be defined in a single delta
queue.
For example, to compare data from the item numbers 1 through 4 to item numbers 11
through 14, you need to use two delta queues because the item numbers are not in one
continuous range.
The following table shows two delta queues for two item ranges updated at the end of each
quarter:
                        Delta Initialization   Delta Update   Delta Update   Delta Update
Delta Queue 1
(Items 1 through 4)     Full Update            Changed Data   Changed Data   Changed Data
Delta Queue 2
(Items 11 through 14)   Full Update            Changed Data   Changed Data   Changed Data
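The update modes above can be sketched as a small Python model. This is illustrative only; the record and queue structures are assumptions, not PowerCenter objects:

```python
from datetime import datetime

def extract(records, mode, queue):
    """Return the records one update mode would pull for a delta queue.

    records: dicts with 'item' and 'changed_at' keys.
    queue:   tracks the 'item_range' and 'last_update' of one delta queue.
    """
    lo, hi = queue["item_range"]
    in_range = [r for r in records if lo <= r["item"] <= hi]
    if mode == "full":                   # all data that meets the selection parameters
        return in_range
    if mode == "delta_init":             # creates the queue; first update extracts all data
        queue["last_update"] = datetime.now()
        queue["previous"] = in_range
        return in_range
    if mode == "delta":                  # only data changed since the last delta update
        changed = [r for r in in_range if r["changed_at"] > queue["last_update"]]
        queue["previous"] = changed
        queue["last_update"] = datetime.now()
        return changed
    if mode == "delta_repeat":           # repeat the previous delta update after errors
        return queue.get("previous", [])
    raise ValueError(f"unknown update mode: {mode}")
```

Two queues with item ranges (1, 4) and (11, 14) would model the non-contiguous selection in the example above.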
Request Files
A request file requests data from an SAP DataSource. Each DataSource has one request file.
When you create a processing mapping, you select a local directory in which to store the
request file. When you save a request file, the processing mapping wizard saves it using the
following syntax:
<DataSource_name>_<update_mode>
For example, if you save a request file for the 0ACCOUNT_ATTR DataSource in full update
mode, the request file is:
0ACCOUNT_ATTR_Full_Update
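The naming syntax can be expressed as a small helper. This is a sketch: the update-mode suffixes in the lookup table are assumptions modeled on the 0ACCOUNT_ATTR_Full_Update example, not PowerCenter values:

```python
# Suffixes are assumptions modeled on the <DataSource_name>_<update_mode> syntax.
UPDATE_MODE_SUFFIX = {
    "full": "Full_Update",
    "delta": "Delta_Update",
    "delta_init": "Delta_Init_With_Transfer",
    "delta_repeat": "Delta_Repeat",
}

def request_file_name(datasource: str, mode: str) -> str:
    """Apply the <DataSource_name>_<update_mode> naming syntax."""
    return f"{datasource}_{UPDATE_MODE_SUFFIX[mode]}"

print(request_file_name("0ACCOUNT_ATTR", "full"))  # 0ACCOUNT_ATTR_Full_Update
```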
If you have an existing request file for the DataSource and update mode, you can revert to the
contents of the saved request file to overwrite the current settings. When you save the current
settings, and a request file for the DataSource and update mode exists, the current settings
overwrite the contents of the request file.
After you create a processing mapping, deploy the request file so that you can use it when you
configure a session to send requests to SAP. For more information, see “Step 5. Deploy the
Request Files” on page 312.
(Processing mapping diagram callouts: a source definition based on the Source_For_BCI table, with ControlRecord, DataSource data, DocumentNumber, and Parameter File targets.)
Note: The name of the source definition in processing mappings is based on the name of the
DataSource. However, the source for all processing mappings is the Source_For_BCI
relational table. For more information about the Source_For_BCI source table, see “Creating
Database Tables for PowerCenter Objects” on page 296.
Field            Required/Optional   Description
Connect String   Required            Type A or Type B destination entry in the saprfc.ini file.
User Name        Required            SAP logical system user name. The user name you use to connect to SAP must be a user for which you have created a logical system. For more information, see “Defining PowerCenter as a Logical System in SAP” on page 31.
Logical System   Required            Logical system name of PowerCenter as configured in SAP for business content integration. For more information, see “Defining PowerCenter as a Logical System in SAP” on page 31. Default is INFACONTNT.
Language         Required            Language you want for the mapping. The language must be compatible with the PowerCenter Client code page. For more information about language and code pages, see “Language Codes, Code Pages, and Unicode Support” on page 395. If you leave this option blank, PowerCenter uses the default language of the SAP system.
(Wizard screen callouts: Find button; Application Component; DataSource.)
Field Description
DataSource Name SAP name of the DataSource used for the mapping.
To select the transfer mode and activate the ABAP extraction program:
1. In step 2 of the Generate Mapping for BCI Wizard, select a transfer mode:
♦ tRFC. Sends data faster and requires fewer resources than IDoc, but cannot be used for
hierarchy DataSources.
♦ IDoc. Stages all data to IDocs before sending them to the Integration Service. IDoc is
the default for hierarchy DataSources.
2. Click Activate DataSource to activate the ABAP extraction program in SAP for the
DataSource.
If you activated this DataSource in another processing mapping, then you do not need to
activate the DataSource again. However, you need to reactivate the DataSource when one
of the following conditions is true:
♦ This mapping uses a different transfer mode.
♦ The DataSource metadata has changed.
Otherwise, the session you create for this mapping fails.
3. Click Next.
Step 3 of the wizard appears.
When you perform a full update, optionally enter a selection value range in the Selection
Criteria area. Enter a selection value range if you want to filter the data. If you are creating a
processing mapping for a hierarchy DataSource, perform the same steps to configure the
request file and data extraction parameters. However, before you send the request file, you
must select a particular hierarchy within the hierarchy DataSource.
1. In step 3 of the Generate Mapping for BCI Wizard, optionally click Revert to populate
all settings from a previously saved request file for the DataSource.
If you click Revert, go to “Naming and Generating the Processing Mapping” on
page 310.
2. Enter a directory for the request file or click the Browse button and select a directory.
3. Select an update mode:
♦ Full. Extracts all the data. If you select Full, go to step 7.
♦ Delta. Extracts the data that has changed since the last update.
4. Select a delta update mode:
♦ Delta Init with Transfer. Creates the delta queue. The next delta update extracts data
that has changed since the delta queue was initialized.
♦ Delta. Extracts changed data since the last delta update.
♦ Delta Repeat. Repeats the previous delta update in case of errors.
5. Select an existing Delta Initialization Request.
- or -
1. In step 4 of the Generate Mapping for BCI Wizard, optionally modify the default name
for the mapping if you selected a non-hierarchy DataSource.
If you selected a hierarchy DataSource, you cannot modify the default name.
2. Optionally, enter a description for the mapping.
3. Click Generate Mapping.
4. Click Yes to close the wizard or No if you want to leave the wizard open to create more
processing mappings.
5. Click Finish.
To override the default query for a non-hierarchy DataSource on Oracle or IBM DB2:
1. In the Target Designer, add the relational target definitions for the processing mapping to
the workspace.
For non-hierarchy processing mappings, add the following relational target definitions to
the workspace:
♦ ControlRecord
♦ BIC_CII<DataSource_name>
♦ DocumentNumber
For the hierarchy processing mapping, add the following relational target definitions to
the workspace:
♦ ControlRecord
♦ E2RSHIE000
♦ E2RSHTX000
♦ E2RSHND000
♦ E2RSHIV000
♦ E2RSHFT000
♦ DocumentNumber
2. Edit each relational target definition and verify that the database type matches the
target database.
3. Select all the relational target definitions.
4. Click Targets > Generate/Execute SQL.
5. Click Connect.
6. Select the ODBC connection, enter the user name and password, and click Connect.
7. Select Create Table, Primary Key, and Foreign Key.
8. Click Generate and Execute SQL.
DS3 pr_DS3
The Integration Service requests the DS1 DataSource first. It uses the sr_DS1 send request
workflow. BCI_Scheduling_Target does not schedule the sr_DS1 send request workflow.
Instead, start the sr_DS1 workflow. BCI_Scheduling_Target requests that the Integration
Service start the remaining workflows in the following order:
1. When BCI_Scheduling_Target receives the complete data for DS1, it requests that the
Integration Service start the pr_DS1 processing workflow.
2. When pr_DS1 completes, BCI_Scheduling_Target requests that the Integration Service
start the sr_DS2 send request workflow. This workflow requests data from SAP for the
DS2 DataSource.
3. When BCI_Scheduling_Target receives the complete data for the DS2 DataSource, it
requests that the Integration Service start the pr_DS2 processing workflow.
4. When pr_DS2 completes, BCI_Scheduling_Target requests that the Integration Service
start the sr_DS3 send request workflow. This workflow requests data from SAP for the
DS3 DataSource.
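The chained start order above can be sketched as follows. This is a simplified model: start_workflow stands in for the request that BCI_Scheduling_Target sends to the Integration Service, and the sr_DS1 send request workflow is assumed to have been started manually beforehand:

```python
def schedule_bci(datasources, start_workflow):
    """Chain processing (pr_) and send request (sr_) workflows in order."""
    for i, ds in enumerate(datasources):
        if i > 0:
            start_workflow(f"sr_{ds}")   # request data for the next DataSource
        start_workflow(f"pr_{ds}")       # process it once the data is complete

started = []
schedule_bci(["DS1", "DS2", "DS3"], started.append)
# started == ["pr_DS1", "sr_DS2", "pr_DS2", "sr_DS3", "pr_DS3"]
```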
Example
Table 15-3 lists the sample processing and send request workflows that you imported from
BCI_Mappings.xml:
Workflow Name   Processing/Send Request   Action
Figure 15-8 displays how BCI_Scheduling_Target schedules these sample processing and send
request workflows:
(Figure 15-8 callouts: DataSources, processing workflows, and send request workflows. Dialog callouts: Add a New Key, Add a New Task, Sessions for Selected Workflow.)
Property                  Required/Optional   Description
Key                       Required            Name of the DataSource that you want to process. You can enter each DataSource only once.
Primary Workflow Name     Required            Name of the processing workflow that processes this DataSource.
Secondary Workflow Name   Required/Optional   Name of the send request workflow that sends a request to SAP for the next DataSource. Do not enter a send request workflow for the last DataSource that you process.
I cannot see the DataSource for which I want to create a processing mapping.
You may not have activated the DataSource within SAP. Activate the DataSource within SAP.
For more information, see “Step 1. Prepare DataSources in SAP” on page 294.
Not all the fields in the DataSource are visible when I create a processing mapping.
Some fields in the DataSource may be hidden. Use transaction RSA6 in SAP to clear the
hidden fields. For more information, see “Step 1. Prepare DataSources in SAP” on page 294.
The update did not return the data I expected. How can I make sure everything is working
properly?
There may be problems within SAP or the DataSource may have hidden fields. Use
transaction RSA3 in SAP to test the extraction within SAP using the same data selection
criteria as the delta update. You can then compare the results with the results from the delta
update. For more information about hidden fields, see the previous troubleshooting tip.
Chapter 16
Extracting Data from BW
Overview
To use BW data as a source in a PowerCenter session, you first create an InfoSpoke in BW.
You can configure the InfoSpoke to filter and transform the data before writing it to the SAP
transparent table or file.
In the PowerCenter Designer, import the SAP transparent table or file as an SAP R/3 source
definition when creating a mapping. In the Workflow Manager, create a workflow and session
for the mapping.
To automate extracting data from BW, use the ZPMSENDSTATUS ABAP program that
Informatica provides. When executed, the ABAP program calls the SAP BW Service, which
initiates the PowerCenter workflow. You then create a process chain to link the InfoSpoke
data extraction program with the ZPMSENDSTATUS ABAP program. Finally, you can
configure the process chain in BW to perform the data extraction and initiate the call to the
PowerCenter workflow automatically on a scheduled basis.
Complete the following steps to configure the BW system and PowerCenter Connect for SAP
NetWeaver BW Option to extract data from BW:
1. Create an InfoSpoke. Create an InfoSpoke in the BW system to extract the data from
BW and write it to an SAP transparent table or file.
2. Create a mapping. Create a mapping in the Designer that uses the SAP transparent table
or file as an SAP R/3 source definition.
3. Create a PowerCenter workflow to extract data from BW. Create a PowerCenter
workflow to automate the extraction of data from BW.
4. Create a process chain to extract data. A BW process chain links programs together to
run in sequence. Create a process chain to link the InfoSpoke and ABAP programs.
When the data extraction has begun, you can view log events to help you track the
interactions between PowerCenter and SAP BW.
To create an InfoSpoke:
1. In the BW Administrator Workbench, click Tools > Open Hub Service > Create
InfoSpoke.
2. Enter a unique name for the InfoSpoke.
3. Select a Template.
4. Right-click and select Change.
The Create InfoSpoke window appears.
5. From the General tab in the Create InfoSpoke window, select a data source from which
to extract data.
If you select an InfoCube as the data source for the InfoSpoke, there may be
discrepancies between the current InfoCube data and the extracted data written to the
SAP transparent table. The discrepancies are due to short dumps. For more information,
see the BW documentation.
6. From the Destination tab, select DB Table or File.
7. Enter a destination for the SAP transparent table or file.
8. Optionally, enable Delete Table Before Extraction.
Use this option if you want to extract the BW data even if it has not changed since the
last time you ran the InfoSpoke in full extraction mode. If you do not select this option
and the BW data has not changed since the last time you ran the InfoSpoke in the full
extraction mode, you receive the SAPSQL_ARRAY_INSERT_DUPREC exception.
9. From the InfoObjects tab, select the InfoObjects to define the extract structure.
10. Optionally, enter selection criteria to restrict the data being extracted.
11. Optionally, add transformations to perform on the data being extracted.
12. Save and Activate the InfoSpoke.
13. Run the InfoSpoke to extract the data and create the SAP transparent table or file that
you use as a source to create a mapping.
1. From the Administrator Workbench in BW, click SAP Menu > Administration >
RSPC - Process Chains.
The Process Chain Maintenance Planning View window appears.
2. Click Create.
The New Process Chain dialog box appears.
3. Enter a unique name for the process chain and enter a description.
4. Click Enter.
The Insert Start Process dialog box appears.
5. Click Create.
The Start Process dialog box appears.
6. Enter a unique name for the start process variant and enter a description.
7. Click Enter.
The Maintain Start Process window appears.
8. Click Change Selections to schedule the process chain.
1. In the Process Chain Maintenance Planning View window, click Process Types.
2. From the Process Types menu, click Load Process and Post-Processing > Data Export
Into External Systems.
The Insert Data Export into External Systems dialog box appears.
3. For the Process Variants field, click the browse button to select the InfoSpoke you
created.
4. Click Enter.
The InfoSpoke process appears in the Process Chain Maintenance Planning View
window.
5. Click the start process description and drag to link the start process to the InfoSpoke
process.
1. In the Process Chain Maintenance Planning View window, click Process Types.
2. From the Process Types menu, click General Services > ABAP Program.
The Insert ABAP Program dialog box appears.
3. Click Create.
The ABAP Program dialog box appears.
4. Enter a unique name for the ABAP program process variant and enter a description.
The BW Process Chain log reports an invalid folder, workflow, or session name.
The folder, workflow, or session name you specified when creating the process chain is
inaccurate. Check the workflow for the exact folder, workflow, or session name and update
the process chain accordingly.
I ran a session that successfully extracted data from SAP BW, but the PowerCenter
Administration Console log includes irrelevant messages about the session.
This may occur if you entered an invalid value for the CONTEXT field in the
ZPMSENDSTATUS ABAP program when creating the process chain. You must enter OHS
for the CONTEXT field in a process chain that extracts data.
Part VIII: BW Data Loading
Chapter 17
Building BW Components
to Load Data into BW
This chapter includes the following topics:
♦ Overview, 338
♦ Step 1. Create an InfoSource, 343
♦ Step 2. Assign an External Logical System, 345
♦ Step 3. Activate the InfoSource, 346
Overview
To load data into SAP BW, first create components within the BW system. You create an
InfoSource, assign it to the PowerCenter logical system that you created in SAP BW, and
activate the InfoSource. When you create and activate the InfoSource, you specify the
InfoSource type and the transfer method that the PowerCenter workflow uses to load data
into the InfoSource.
SAP BW Hierarchy
You can create a PowerCenter workflow to load data into an SAP BW hierarchy. When you
want to load data into a hierarchy, create an InfoSource for master data in the target SAP BW
system. After you create the InfoSource, import the hierarchy as a target definition into the
Designer. When you import the definition of a hierarchy, the Designer creates a transfer
structure with fields that make up the structure of the SAP BW hierarchy.
An SAP BW hierarchy is a tree-like structure that defines classes of information. Each level of
the hierarchy represents a different class. An SAP BW hierarchy displays an SAP BW
characteristic, which is a reference object with dimensions. The hierarchy is structured and
grouped according to individual evaluation criteria based on the characteristic. The structure
at each level of the hierarchy is called a node.
A hierarchy has the following types of nodes:
♦ Root node. The root node is the highest node in the structure and is the origin of all other
nodes. The root node represents the hierarchy.
♦ Child node. A child node is a node that is subordinate to another node.
♦ Leaf node. Leaf nodes are the lowest level in the hierarchy. Leaf nodes do not have any
successor nodes.
(Figure: a hierarchy with child nodes Asia (ID = 2) and N. America (ID = 3) under a root node.)
The root node represents the Sales department in an organization with two regions as child
nodes and four cities as leaf nodes.
You can configure the structure of a hierarchy in the SAP BW InfoSource properties. The
following settings define the structure of an SAP BW hierarchy:
♦ Sorted hierarchy. A sorted hierarchy defines the sequence of nodes for each level of a
hierarchy. When you specify a sorted hierarchy, each child node is ordered in relation to
its sibling nodes.
♦ Interval hierarchy. An interval hierarchy specifies a range of characteristic values. You can
use one interval to represent multiple leaf nodes. You define an interval in SAP BW by
setting a range of values.
♦ Time-dependent hierarchy. A time-dependent hierarchy specifies a range of dates. You
can define either the entire hierarchy or the hierarchy structure as time-dependent. When
the entire hierarchy is time-dependent, it is valid only in a defined range of dates. When
the hierarchy structure is time-dependent, the nodes in the hierarchy change for specified
periods of time while the name and version of the hierarchy remain static. You define a
date range in SAP BW by setting a range of dates.
♦ Version-dependent hierarchy. Version-dependent hierarchies have the same name, but
different versions.
When you import an InfoSource with a hierarchy as an SAP BW target definition, the
Designer only creates the fields in the transfer structure definition that are necessary to
replicate the structure of the SAP BW hierarchy defined in SAP BW.
Table 17-1 shows the fields the Designer can create in the SAP BW target definition when you
import metadata for a hierarchy definition from SAP BW:
Field      Hierarchy Type     Description
NODEID     All hierarchies    Local and unique identifier for the hierarchy node.
PARENTID   All hierarchies    NODEID of the parent node of the hierarchy node.
CHILDID    Sorted hierarchy   NODEID of the first child node of the hierarchy node.
NEXTID     Sorted hierarchy   NODEID of the sibling node following the hierarchy node.
If you import different versions of a hierarchy into an SAP BW target definition, the
hierarchy name includes the version. If you import a hierarchy with time-dependent values,
the hierarchy name contains the date range specified for the hierarchy.
During a PowerCenter workflow, the Integration Service loads the hierarchy into the transfer
structure fields. For example, you can load the sample hierarchy in Figure 17-1 into the
transfer structure defined in Table 17-2.
Table 17-2 shows the transfer structure for the sample hierarchy in Figure 17-1:
NODEID   InfoObject   Node Name      PARENTID   NEXTID   Node Text
3        0HIER_NODE   N. America     1          6        N. America
4        0LOCATION    Tokyo          2          5
5        0LOCATION    Bangalore      2
6        0LOCATION    Redwood City   3          7
7        0LOCATION    Austin         3
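The NODEID and PARENTID fields are enough to rebuild the tree. For example, an illustrative sketch using the rows above:

```python
def children_of(rows, parent_id):
    """rows: (nodeid, name, parentid) tuples from the transfer structure."""
    return [name for nodeid, name, parentid in rows if parentid == parent_id]

rows = [
    (3, "N. America", 1),
    (4, "Tokyo", 2),
    (5, "Bangalore", 2),
    (6, "Redwood City", 3),
    (7, "Austin", 3),
]
print(children_of(rows, 3))  # ['Redwood City', 'Austin']
```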
Steps to Build BW Components to Load Data into BW
To load data into SAP BW, build the following components in the SAP BW Administrator
Workbench:
1. Create the InfoSource in SAP BW. The InfoSource represents a provider structure. You
create it in the SAP BW Administrator Workbench, and you import the definition into
the PowerCenter Target Designer.
2. Assign the InfoSource to the PowerCenter logical system. After you create an
InfoSource, assign it to the PowerCenter logical system that you created in SAP BW.
3. Activate the InfoSource. When you activate the InfoSource, you activate the InfoObjects
and the transfer rules.
After you build and activate the components, you can import InfoSources into PowerCenter
and build mappings. For more information, see “Building PowerCenter Objects to Load Data
into BW” on page 347.
Creating an InfoSource
You can create InfoSources in SAP BW using the Administrator Workbench.
To create an InfoSource:
1. From the Administrator Workbench, right-click the InfoSource and select Change.
2. Select the InfoObjects and move them into the communication structure.
3. Click the Activate button.
4. Select the Transfer rules tab.
5. Select one of the following transfer methods and click Activate:
Method Description
IDoc Use IDoc to synchronously move data from transfer structure to InfoCube.
For more information about the transfer methods, see “Transfer Methods for Loading
Data into SAP BW” on page 341.
Note: If you want to populate an InfoCube, also define it in the SAP BW Administrator
Workbench. Define the update rules to update the InfoCube from the transfer structure.
Chapter 18
Building PowerCenter
Objects to Load
Data into BW
This chapter includes the following topics:
♦ Overview, 348
♦ Step 1. Import an InfoSource, 349
♦ Step 2. Create a Mapping, 351
♦ Filtering Data to Load into SAP BW, 352
Overview
After you create and activate an InfoSource in SAP BW, use the Designer to import the
InfoSource as an SAP BW target definition. You can then include the SAP BW target
definition in a mapping to write data to SAP BW during a PowerCenter session.
To import an InfoSource:
1. From the Target Designer, click Targets > Import from SAP BW.
2. From the Import SAP BW Metadata dialog box, enter the following information:
Field Description
Connect String Type A or Type B destination entry in the saprfc.ini file. When you import an SAP BW
InfoSource for the first time, the Designer reads the saprfc.ini file and displays the first DEST
entry as the connect string. After the first time you import an SAP BW InfoSource, the
Designer stores and displays the DEST entry used for the previous import. Select the DEST
parameter that specifies the SAP BW source system from which you want to import
InfoSources.
Language The language in which you want to receive messages from the SAP BW system while
connected through this dialog box. Use a language code valid for the SAP BW system you are
connecting to. If you leave this blank, PowerCenter uses the default language of the SAP
system to connect to SAP BW.
InfoSources
If you import an InfoSource from the Master Transfer List, you can select to import
Attribute and/or Text InfoSources.
5. Select the InfoSources you want to import.
♦ Hold down the Shift key to select blocks of sources.
♦ Hold down the Ctrl key to make non-contiguous selections within a folder.
♦ Use the Select All button to select all tables.
♦ Use the Select None button to clear all selections.
6. Click Add To Import List.
7. To view the list, click View Import List.
The Import List dialog box appears.
8. To remove items from the list that you do not want to import, select the item and click
Delete.
9. Click Close to close the Import List dialog box.
You cannot include multiple InfoSources with the same name from different source
systems when you select InfoSources to import. Import multiple InfoSources with the
same name from different source systems separately.
10. When the Import List is complete, click OK.
The InfoSource definitions display as target tables in the Target Designer. SAP BW may
add /BIC/ to the SAP BW InfoObject name.
Tip: If the target displays the correct InfoSource name, but no ports display, make sure you
correctly followed the steps to create an InfoSource in SAP BW.
Table 18-1. Sample Data Selection Entry in SAP BW for a Relational Source
When the SAP BW Scheduler sends the SAP BW Service a workflow request, the SAP BW
Service receives the data selection information for the relational source and writes it to the
temporary parameter file. For example, the SAP BW Service writes the following to the
temporary parameter file for the data selections defined in Table 18-1:
$$BWFILTERVAR="EmpId" >= '2222' AND "EmpId" <= '9999' AND
("DeptId" = '804')
During a PowerCenter workflow, the Integration Service uses the value of the
$$BWFILTERVAR mapping parameter to filter data from the Oracle source.
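The rendering of data selection entries into the $$BWFILTERVAR value can be sketched as follows. The format is an assumption modeled only on the example above, not the SAP BW Service implementation:

```python
def build_bw_filter(entries):
    """entries: (field, from_value, to_value_or_None) tuples from an InfoPackage."""
    clauses = []
    for field, lo, hi in entries:
        if hi is not None:   # a range becomes a pair of comparisons
            clauses.append(f"\"{field}\" >= '{lo}' AND \"{field}\" <= '{hi}'")
        else:                # a single value becomes a parenthesized equality
            clauses.append(f"(\"{field}\" = '{lo}')")
    return "$$BWFILTERVAR=" + " AND ".join(clauses)

print(build_bw_filter([("EmpId", "2222", "9999"), ("DeptId", "804", None)]))
# $$BWFILTERVAR="EmpId" >= '2222' AND "EmpId" <= '9999' AND ("DeptId" = '804')
```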
For more information about creating a mapping parameter to extract data from a relational
source to load into SAP BW, see “Configuring Mapping Parameters for SAP BW Data
Selection” on page 357. For more information about entering a filter condition in a Source
Qualifier transformation, see “Source Qualifier Transformation” in the PowerCenter
Transformation Guide.
Table 18-2. Sample Data Selection Entry in SAP BW for a Flat File Source
After configuring the data selection entry in the SAP BW InfoPackage, create mapping
parameters for the values that define the data selection.
Table 18-3 shows the mapping parameters you create for the data selection defined in
Table 18-2:
$$EMPID_FROM_0 Defines the start value for the range of the data selection entry.
$$EMPID_TO_0 Defines the end value for the range of the data selection entry.
Use the mapping parameters in the filter condition of the Filter transformation. For example,
to represent the data selection entry you defined in the SAP BW InfoPackage, enter the
following in the filter condition:
EmpId >= $$EMPID_FROM_0 AND
EmpId <= $$EMPID_TO_0
During the workflow, the Integration Service uses the temporary parameter file to obtain
values for the $$EMPID_FROM_0 and $$EMPID_TO_0 mapping parameters in the data
selection entry. The Integration Service then uses the data selection entry to filter data from
the source.
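The substitution the Integration Service performs can be illustrated with a small sketch, not the actual implementation:

```python
import re

def resolve_parameters(condition: str, param_file_lines):
    """Replace $$NAME tokens in a filter condition with parameter file values."""
    params = dict(line.split("=", 1) for line in param_file_lines)
    return re.sub(r"\$\$\w+", lambda m: params[m.group(0)], condition)

print(resolve_parameters(
    "EmpId >= $$EMPID_FROM_0 AND EmpId <= $$EMPID_TO_0",
    ["$$EMPID_FROM_0=2222", "$$EMPID_TO_0=9999"],
))
# EmpId >= 2222 AND EmpId <= 9999
```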
For more information on naming a mapping parameter for SAP BW data selection for flat file
sources, see “Mapping Parameter for a Flat File or SAP R/3 Source” on page 358. For more
information about creating mapping parameters for SAP BW data selection, see “Configuring
Mapping Parameters for SAP BW Data Selection” on page 357. For more information about
entering a data selection entry in the Filter transformation, see “Filter Transformation” in the
PowerCenter Transformation Guide. For more information about configuring and viewing
data selection entries in SAP BW, see “Step 2. Configure an InfoPackage” on page 364.
Table 18-4. Sample Data Selection Entries in SAP BW for an SAP R/3 Source
After you configure the data selection entry in the SAP BW InfoPackage, you create mapping
parameters for the values that define the data selection.
Table 18-5 shows the mapping parameters you create for the data selection defined in
Table 18-4:
$$MATNR_FROM_0 Defines the start value for the range of the data selection entry.
$$MATNR_TO_0 Defines the end value for the range of the data selection entry.
After you create the mapping parameters, use the mapping parameters in a dynamic filter
condition to represent the data selection you configured in the SAP BW InfoPackage. You
enter a dynamic filter for an SAP R/3 source from the Dynamic Filter tab of the ABAP
Program Flow dialog in the Application Source Qualifier properties. When you filter data
from an SAP R/3 source, the dynamic filter condition must conform to ABAP syntax.
Figure 18-1 shows the data selection entry in the SAP ABAP Program Flow dialog box:
Figure 18-1. Sample Dynamic Filter Condition in the ABAP Program Flow Dialog Box
When the SAP BW Scheduler sends the SAP BW Service a workflow request, the SAP BW
Service receives the data selection information for the SAP R/3 source and writes it to the
temporary parameter file. For example, the SAP BW Service writes the following to the
temporary parameter file for the data selection defined in Table 18-4:
$$MATNR_FROM_0=MR0842
$$MATNR_TO_0=MT0727
$$BLANZ_FROM_0=219
During the workflow, the Integration Service uses the temporary parameter file to obtain
values for the $$MATNR_FROM_0, $$MATNR_TO_0, and $$BLANZ_FROM_0
mapping parameters in the data selection entry. The Integration Service then uses the data
selection entry to filter data from the source.
For more information about how to name a mapping parameter for SAP BW data selection
from SAP R/3 sources, see “Mapping Parameter for a Flat File or SAP R/3 Source” on
page 358. For more information about creating mapping parameters for an SAP BW data
selection, see “Configuring Mapping Parameters for SAP BW Data Selection” on page 357.
For more information about entering a filter condition for an SAP R/3 source, see “Entering a
Source Filter” on page 141. For more information about configuring and viewing data
selection entries in SAP BW, see “Step 2. Configure an InfoPackage” on page 364.
Figure 18-2. $$BWFILTERVAR Mapping Parameter for SAP BW Data Selection Entries
Table 18-6 shows the options you need to use to create the $$BWFILTERVAR mapping
parameter for data selection:
Option Description
Table 18-7 describes the components of a mapping parameter name for a flat file or an SAP
R/3 source:
Table 18-7. Components of Mapping Parameter Name for Flat File and SAP R/3 Sources
InfoObject Name Name of the field or InfoObject from which you want to filter data.
From | To “From” defines the start of a range or a single value. Use “From” when you specify the
FromValue field in the SAP BW data selection entry.
“To” defines the end of a range. Use “To” when you specify the ToValue field in the SAP
BW data selection entry.
number Distinguishes mapping parameters with similar names created for the same InfoObject.
Use 0 for a mapping parameter name created for the first time. Increment the number in
the mapping parameter name by one for each subsequent mapping parameter for the
same InfoObject. For example, if you have two data selection entries for the EmpID field
that specify different values for FromValue, use 0 and 1 for the number parameter
component.
For example, you configure an SAP BW InfoPackage to filter data to extract records where the
LocationID equals 24 or 19. Create two mapping parameters to represent the data selection
entries in the SAP BW InfoPackage:
♦ $$LocationID_From_0
♦ $$LocationID_From_1
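The naming rule can be captured in a small helper. This is an illustrative sketch of the $$<InfoObject>_<From|To>_<number> convention:

```python
def mapping_parameter_name(infoobject: str, end: str, number: int) -> str:
    """Build a $$<InfoObject>_<From|To>_<number> mapping parameter name."""
    if end not in ("From", "To"):
        raise ValueError("end must be 'From' or 'To'")
    return f"$${infoobject}_{end}_{number}"

# Two entries for the same InfoObject increment the trailing number:
print(mapping_parameter_name("LocationID", "From", 0))  # $$LocationID_From_0
print(mapping_parameter_name("LocationID", "From", 1))  # $$LocationID_From_1
```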
When you want to filter data from a flat file or SAP R/3 source to load into SAP BW, select
the datatype for the mapping parameter based on the datatype of the InfoObject. When you
specify the precision for a mapping parameter, make sure that it is the same as the precision
defined for the corresponding SAP BW InfoObject.
Table 18-8 shows the datatype you need to use for a mapping parameter based on the datatype
of the InfoObject:
CHAR String
Overview
To load data into the SAP BW system, configure both PowerCenter and the SAP BW system.
Complete the following steps to load data into SAP BW:
1. Configure a workflow to load data into SAP BW. Create a session that uses a mapping
with an InfoSource target definition.
2. Configure an InfoPackage. Create an InfoPackage that associates the PowerCenter
session with the InfoSource.
3. Configure a process chain to load data. Create a process chain that links the InfoPackage
process, additional loading processes, and the ZPMSENDSTATUS ABAP program.
When the SAP BW Service starts, it communicates with the SAP BW system to register itself
as a server. The SAP BW Service waits for a request from the SAP BW system to start the
workflow. When the InfoPackage starts, the SAP BW system communicates with the
registered SAP BW Service and sends the workflow name to be scheduled with the Integration
Service. The SAP BW Service reads information about the workflow and sends a request to
the Integration Service to run the workflow. The Integration Service validates the workflow
name in the repository and the workflow name in the InfoPackage. The Integration Service
executes the session and loads the data into SAP BW.
You can view log events to help you track the interactions between PowerCenter and SAP
BW. For more information, see “Viewing Log Events” on page 370.
You can configure sessions that load data into SAP BW for workflow recovery. If a session
enabled for recovery fails, use the Workflow Manager or Workflow Monitor to recover the
PowerCenter workflow. When you recover a workflow, the Integration Service can resume
the session that failed. When the workflow completes its run in recovery mode, the SAP BW
system can start a normal run of the workflow. You can use the Workflow Manager or
Workflow Monitor to start a PowerCenter workflow containing an SAP BW session in
recovery mode only.
Domain Name for DI Service (Required). Name of the PowerCenter domain for the Integration Service that runs the workflow.

Data Integration Service Name (Required). Name of the PowerCenter Integration Service that runs the workflow.

Name of Folder Containing Workflow (Required). Name of the PowerCenter folder containing the workflow.

Session Name (Optional). PowerCenter session name. If you enter a session name, the Integration Service runs only this session in the workflow. If you do not enter a session name, the Integration Service runs the entire workflow. You must enter a session name if you are filtering data from a relational source before loading it into a BW target. For more information, see “Filtering Data to Load into SAP BW” on page 352.
Figure 19-1. Configuring a Process Chain that Loads to the PSA Only
The process chain can also contain an InfoPackage that loads data to the PSA and additional
processes that load the data to data targets. Insert the ZPMSENDSTATUS ABAP program
after each loading process to ensure that the SAP BW Service receives status information at
each point in the process chain.
Figure 19-2 displays a process chain that loads to the PSA and to a data target:
Figure 19-2. Configuring a Process Chain that Loads to the PSA and a Data Target
1. From the Administrator Workbench in BW, click SAP Menu > Administration >
RSPC - Process Chains.
The Process Chain Maintenance Planning View window appears.
2. Click Create.
The New Process Chain dialog box appears.
3. Enter a unique name for the process chain and enter a description.
4. Click Enter.
The Insert Start Process dialog box appears.
5. Click Create.
The Start Process dialog box appears.
6. Enter a unique name for the start process variant and enter a description.
7. Click Enter.
The Maintain Start Process window appears.
8. Click Change Selections to schedule the process chain.
The Start Time window appears.
9. To schedule the process chain to run immediately after you execute it, click Immediate.
10. Click Save.
11. In the Maintain Start Process window, click Cancel.
12. In the Insert Start Process dialog box, click Enter.
The start process appears in the Process Chain Maintenance Planning View window.
1. In the Process Chain Maintenance Planning View window, click Process Types.
2. From the Process Types menu, click Load Process and Post-Processing > Execute
InfoPackage.
The Insert Execute InfoPackage dialog box appears.
3. For the Process Variants field, click the browse button to select the InfoPackage you
created.
4. Click Enter.
The InfoPackage process appears in the Process Chain Maintenance Planning View
window.
5. Click the start process description and drag to link the start process to the InfoPackage
process.
1. In the Process Chain Maintenance Planning View window, click Process Types.
2. From the Process Types menu, click General Services > ABAP Program.
The Insert ABAP Program dialog box appears.
3. Click Create.
The ABAP Program dialog box appears.
4. Enter a unique name for the ABAP program process variant and enter a description.
5. Click Enter.
The Process Maintenance: ABAP Program window appears.
6. In the Program Name field, click the browse button to select the ZPMSENDSTATUS
ABAP program.
7. Click Change next to the Program Variant field.
The ABAP: Variants - Initial Screen window appears.
8. Click Create.
The error number embedded in the message originates from the operating system.
The InfoPackage aborts if the SAP BW Service does not connect to the Integration Service
immediately.
If you have problems starting an InfoPackage, test the connection using the Administrator
Workbench.
1. From the SAP BW Administrator Workbench, click the Source Systems tab.
2. Right-click the source system and select Change.
3. Click the Test Connection button.
The RFC Connection Test returns a status screen that indicates the status and
description of the connection.
When I try to start a workflow containing an SAP BW session in the Workflow Manager,
nothing happens.
You cannot use the Workflow Manager to start or schedule a PowerCenter workflow with an
SAP BW session. You need to configure the workflow to run on demand. Create an
InfoPackage in the SAP BW system to schedule the workflow containing an SAP BW session.
Troubleshooting 373
I need to stop a workflow containing an SAP BW session.
To stop an SAP BW workflow, stop the PowerCenter workflow using pmcmd or in the
Workflow Monitor. You cannot stop an InfoPackage in SAP BW. For more information
about stopping a workflow, see “Working with Workflows” in the PowerCenter Workflow
Administration Guide.
The InfoPackage starts in SAP BW, but no message appears in the log for the PowerCenter
session.
This may occur if more than one SAP BW Service in the same environment uses the same
Program ID. In that case, the SAP BW Service that was started first receives the requests
from the SAP BW system.
When you do not see a message in the PowerCenter Administration Console or BW Monitor
log for an InfoPackage that starts in SAP BW, verify whether other SAP BW Services are
connected to the SAP BW system. Check the logs for the other SAP BW Services to see if the
InfoPackage started.
I ran a session to load data into SAP BW. However, the session status reported by SAP BW is
not the same as the session status reported by the Integration Service.
Status messages coming from SAP BW are not transported correctly to the Integration Service
when the load is successful but has zero rows.
In SAP BW, you can set the Traffic Light Color options to indicate success if there is no data.
SAP BW sends a status message of success to the Integration Service when a load is successful
but has zero rows. For more information, see the SAP BW documentation.
The SAP BW Service starts my filter session run, but the log includes an error message
“Error in opening parameter file....”
This only occurs on Windows. The permissions on the directory that contains the parameter
file are not set correctly.
Enable the appropriate read and write permissions for the parameter file directory. For more
information, see “Creating and Configuring the SAP BW Service” in the PowerCenter
Administrator Guide.
I ran a session that successfully loaded data into SAP BW, but the PowerCenter
Administration Console log includes irrelevant messages about the session.
This may occur if you entered an invalid value for the CONTEXT field in the
ZPMSENDSTATUS ABAP program when creating the process chain. You must enter BW
LOAD for the CONTEXT field in a process chain that loads data.
This may occur if the Packet Size session property is higher than the packet size configured in
the BW system or higher than the available memory on the node where the Integration
Service process runs. Decrease the value of the Packet Size property and run the session again.
376 Chapter 19: Loading Data into BW
Part IX: Appendixes
Appendix A
Virtual Plug-in
Virtual Plug-in (NullDbtype) Source Definition
The Virtual plug-in source definition is a placeholder for an actual source definition in a
mapping. It does not receive data from a source. You use a Virtual plug-in source in a
mapping for mapping validation only.
The Virtual plug-in source definition contains a single port, FIELD1, of the null datatype.
The output of this port links to the Application Multi-Group Source Qualifier
transformation in the mapping.
The Designer creates a Virtual plug-in source definition when you generate the following
types of mappings in the Mapping Designer:
♦ RFC/BAPI No Input mapping. For more information, see “Working with RFC/BAPI No
Input Mappings” on page 232.
♦ RFC/BAPI Multiple Stream mapping. For more information, see “Working with RFC/
BAPI Multiple Stream Mappings” on page 236.
♦ RFC/BAPI Single Stream mapping. For more information, see “Working with RFC/BAPI
Single Stream Mappings” on page 246.
To create a Virtual plug-in target definition from a Virtual plug-in source definition:
Datatype Reference
SAP Datatypes
Table B-1 lists datatypes available in SAP and SAP BW systems:
ACCP (Date). Posting period of 6 positions in the format YYYYMM. In input and output, a point is inserted between year and month, so the template of this datatype has the form ‘____.__’.

CHAR (Text). Character string with a maximum length of 255. If longer fields are required, select datatype LCHR.

CUKY (Text). Currency key of 5 positions containing the possible currencies referenced by CURR fields.

CURR (Numeric). Currency field of 1 to 17 positions, equivalent to a DEC amount field. A CURR field must reference a CUKY field. For the P type, only 14 digits are allowed after the decimal point.

DEC (Numeric). Counter or amount field with decimal point, sign, and commas separating thousands; maximum length of 31 positions. For the P type, only 14 digits are allowed after the decimal point.

INT2 (Numeric). 2-byte integer between -32,767 and 32,767, used only for length fields positioned immediately in front of LCHR and LRAW. With INSERT or UPDATE on the long field, the database interface enters the length used in the length field. The length is set at 5 positions.

INT4 (Numeric). 4-byte integer between -2,147,483,647 and 2,147,483,647. The length is set at 10 positions.

LANG (Text). Language key of 1 position, a field format for special functions.

LCHR (Text). Long character string with a minimum length of 256 characters. Must be at the end of a transparent table and must be preceded by an INT2 length field.

NUMC (Text). Character field with a maximum length of 255 positions. Only numbers can be entered.

QUAN (Text). Quantity field with a maximum length of 17 positions; points to a unit field with format UNIT. For the P type, only 14 digits are allowed after the decimal point.

TIMS (Date). Time field (HHMMSS) of 6 positions; the display format is HH.MM.SS.

UNIT (Text). Units key of 2 or 3 positions, a field containing the allowed quantity units referenced by QUAN fields.

VARC (Text). Variable-length character string that requires an INT2 length field. No longer supported as of Release 3.0.
SAP Datatype | Transformation Datatype | Range for Transformation Datatype
Although the Integration Service converts most SAP datatypes successfully, you might need to
override the Application Source Qualifier properties for the following datatypes:
♦ Date and number datatypes. While PowerCenter converts the SAP datatype to its own
transformation datatype, there are some instances where you might want to edit the
Application Source Qualifier to ensure full precision of dates and numbers.
♦ Binary datatypes. PowerCenter provides limited support for binary data.
♦ CHAR, CUKY, and UNIT datatypes. PowerCenter treats the SAP datatype CHAR as
VARCHAR. PowerCenter trims trailing blanks for CHAR, CUKY, and UNIT data so you
can compare SAP data with other source data.
NUMC
NUMC is a numeric string that supports more positions than any of the PowerCenter
numeric datatypes. It holds only unsigned numeric strings with a maximum length of 255.
The Application Source Qualifier converts NUMC to Decimal. You can also configure a
Source Qualifier to convert this datatype to Double.
By default, the Integration Service treats all Decimal ports as Double and maintains precision
up to 15 digits. If NUMC has up to 28 digits, you can enable high precision in the session
properties to maintain the full precision.
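The precision trade-off can be illustrated in Python, standing in for the Double-versus-high-precision behavior described above (the 28-digit value is invented for illustration):

```python
from decimal import Decimal

# A 28-digit NUMC value (illustrative).
numc = "1234567890123456789012345678"

# Treated as Double, only about 15 significant digits survive,
# so the round-tripped integer no longer matches the original.
as_double = float(numc)
assert int(as_double) != int(numc)

# Treated as a high-precision decimal, all 28 digits are preserved.
as_decimal = Decimal(numc)
assert str(as_decimal) == numc
```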
Binary Datatypes
PowerCenter provides limited support for the binary datatypes RAW and LRAW. RAW holds
binary data up to 255 bytes. LRAW holds binary data with a minimum of 256 bytes.
PowerCenter can move binary data to a relational target, but it cannot transform it.
PowerCenter cannot move binary data to flat file targets.
To move binary data, connect the RAW or LRAW column to a compatible binary column in
the target definition. You can pass the binary data through other transformations, but you
cannot perform mapping logic on the binary data.
For example, connect a RAW column from the SAP R/3 source to the Application Source
Qualifier. The Application Source Qualifier uses the Binary transformation datatype. You can
then pass the binary column to binary columns in other transformations and finally to a RAW
column in Oracle. The SAP RAW datatype in SAP is compatible with the Oracle RAW
datatype. If you apply mapping logic to the binary datatype, the session fails.
SAP BW | Binary | Date/Time | Decimal | Double, Real | Integer, Small Integer | String, Nstring, Text, Ntext
The Integration Service transforms data based on the PowerCenter transformation datatypes.
The Integration Service converts all data to a CHAR datatype and puts it into packets of 250
bytes, plus one byte for a continuation flag. SAP BW receives data until it reads the
continuation flag set to zero. Within the transfer structure, SAP BW then converts the data to
the SAP BW datatype.
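The packet layout described above can be sketched as follows. The flag encoding (1 for more packets to come, 0 for the last packet) is an assumption for illustration; the document states only that one byte carries a continuation flag and that SAP BW reads until the flag is zero:

```python
def to_packets(payload: bytes, size: int = 250) -> list[bytes]:
    """Split CHAR data into packets of `size` data bytes plus a one-byte
    continuation flag: 1 = more packets follow, 0 = last packet.
    Illustrative sketch only, not the actual Integration Service logic."""
    chunks = [payload[i:i + size] for i in range(0, len(payload), size)] or [b""]
    return [chunk + (b"\x01" if i < len(chunks) - 1 else b"\x00")
            for i, chunk in enumerate(chunks)]

packets = to_packets(b"A" * 600)
# 600 bytes → data chunks of 250, 250, and 100 bytes; only the last flag is 0.
```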
Date/Time Datatypes
The transformation Date/Time datatype supports dates with precision to the second. If you
import a date/time value that includes milliseconds, the Integration Service truncates to
seconds. If you write a date/time value to a target column that supports milliseconds, the
Integration Service inserts zeros for the millisecond portion of the date.
Binary Datatypes
SAP BW does not allow you to build a transfer structure with binary datatypes. Therefore,
you cannot load binary data from PowerCenter into SAP BW.
Numeric Datatypes
PowerCenter Connect for SAP NetWeaver BW Option does not support the INT1 datatype.
For numeric datatypes such as CURR, DEC, FLTP, INT2, INT4, and QUAN, the
Integration Service uses the precision of the SAP BW datatype to determine the length of the
data loaded into SAP BW. For example, if you try to load a value of -1000000000 into an
SAP BW field with the INT4 datatype, the Integration Service skips the row. This is because
the INT4 datatype supports data up to 10 bytes in length, but the -1000000000 value uses 11
bytes.
The Integration Service does not truncate extraneous bytes when loading data that exceeds
the length allowed by the field datatype. If a row contains data that exceeds the length allowed
by the field datatype in SAP BW, the Integration Service skips the row and writes the skipped
row and a corresponding error message to the session log.
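A minimal sketch of this length check, using the INT4 example above (illustrative only, not the actual Integration Service logic):

```python
def fits_field(value, precision: int) -> bool:
    """Return True if the formatted value fits the SAP BW field precision.
    Rows whose formatted length exceeds the precision are skipped rather
    than truncated. Hypothetical helper for illustration."""
    return len(str(value)) <= precision

# INT4 allows data up to 10 positions. "-1000000000" needs 11 characters,
# so the Integration Service skips that row.
assert fits_field(2147483647, 10)        # 10 characters: loaded
assert not fits_field(-1000000000, 10)   # 11 characters: skipped
```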
DATS
You can pass any string, text, or date/time value to a DATS column. The Integration Service
converts data to the YYYYMMDD format.
If you pass strings to a DATS column, the Integration Service converts the data as follows:
‘02/01/1996’ 19960201
‘09/10/49’ Error
‘01-21-51’ Error
‘10023’ Error
‘Jan151999’ Error
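The string-to-DATS rules in the table can be sketched as follows. This is a simplified stand-in that accepts only the MM/DD/YYYY form shown above and rejects everything else; the actual conversion logic belongs to the Integration Service:

```python
import re

def to_dats(value: str) -> str:
    """Convert an MM/DD/YYYY string to the DATS format YYYYMMDD.
    Strings in any other form are rejected, matching the table above.
    Illustrative sketch, not the actual Integration Service behavior."""
    match = re.fullmatch(r"(\d{2})/(\d{2})/(\d{4})", value)
    if not match:
        raise ValueError(f"cannot convert {value!r} to DATS")
    month, day, year = match.groups()
    return year + month + day

assert to_dats("02/01/1996") == "19960201"
# '09/10/49', '01-21-51', '10023', and 'Jan151999' all raise ValueError.
```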
If you pass dates to a DATS column, the Integration Service converts the dates as follows:
12/08/98 19981208
04/12/52 20520412
03/17/49 19490317
11/22/1998 19981122
TIMS
You can pass any string, text, or date/time value to a TIMS column. The Integration Service
converts the time portion of the string or date to the HHMMSS format. For example, the
string ‘09/23/1998’ contains no time portion, so the Integration Service converts it to 000000.
If you pass dates to a TIMS column, the Integration Service converts the dates as follows:
12/08/98 000000
11/22/1998 000000
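The time-portion conversion can be sketched as follows (a simplified stand-in for the actual service logic):

```python
from datetime import datetime

def to_tims(value: datetime) -> str:
    """Render the time portion of a date/time value in HHMMSS form.
    A date with no time portion yields 000000, as in the table above.
    Illustrative sketch, not the actual Integration Service behavior."""
    return value.strftime("%H%M%S")

assert to_tims(datetime(1998, 12, 8)) == "000000"          # date only
assert to_tims(datetime(1998, 12, 8, 9, 30, 15)) == "093015"
```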
Language Code Selection
SAP supports many languages, but your system may be configured to support only a subset of
the languages. When you configure the application connection to connect to the SAP or BW
system, you may have to specify the language code of the SAP or BW system.
Table C-1 shows which application connections require a language code:
SAP_ALE_IDoc_Reader — No
SAP_ALE_IDoc_Writer — Yes
SAP BW — Yes
FTP — No
The language code you select affects the following PowerCenter Connect for SAP NetWeaver
mySAP Option tasks:
♦ Import SAP metadata in the Designer. The Designer imports metadata in the language
you specify. The SAP system returns messages to the Designer in the language you specify.
♦ Install ABAP program. The SAP system returns messages to the Designer in the language
you specify.
♦ Run sessions. The ABAP program extracts data in the language specified in the application
connection. The SAP system also returns session log and server messages in the language
you specify. When you configure an application connection, you also select a code page.
For information about code page selection, see “Code Page Selection” on page 398.
The language code you select affects the following PowerCenter Connect for SAP NetWeaver
BW Option tasks:
♦ Import InfoSource definitions. The BW system returns messages to the Designer in the
language you specify.
♦ Run sessions. The BW system returns session log and server messages in the language you
specify. When you configure a database connection, you also select a code page. For
information about code page selection, see “Code Page Selection” on page 398.
Under the following conditions, SAP substitutes the default language of that particular SAP
or BW system:
♦ You leave the language code blank.
♦ You specify a valid language code, but the SAP or BW system you connect to does not
support that language.
Albanian (SQ): Latin1, MS1252, UTF-8
Belorussian (BE): UTF-8
Bulgarian (BG): ISO-8859-5, MS1251, UTF-8
Catalan (CA): Latin1, MS1252, UTF-8
Croatian (HR): ISO-8859-2, MS1250, UTF-8
Czech (CS): ISO-8859-2, MS1250, UTF-8
Danish (DA): Latin1, MS1252, ISO-8859-9, UTF-8
Dutch (NL): Latin1, MS1252, ISO-8859-9, UTF-8
Estonian (ET): ISO-8859-4, MS1257, UTF-8
Farsi (FA): ISO-8859-6, UTF-8
Finnish (FI): Latin1, MS1252, ISO-8859-9, UTF-8
German (DE): Latin1, MS1252, ISO-8859-9, UTF-8
Greek (EL): ISO-8859-7, MS1253, UTF-8
Hebrew (HE): ISO-8859-8, MS1255, UTF-8
Hungarian (HU): ISO-8859-2, MS1250, UTF-8
Italian (IT): Latin1, MS1252, ISO-8859-9, UTF-8
Japanese (JA): MS932, UTF-8
Korean (KO): MS949, UTF-8
Latvian (LV): ISO-8859-4, MS1257, UTF-8
Lithuanian (LT): ISO-8859-4, MS1257, UTF-8
Macedonian (MK): ISO-8859-5, MS1251, UTF-8
Norwegian (NO): Latin1, MS1252, ISO-8859-9, UTF-8
Polish (PL): ISO-8859-2, MS1250
Romanian (RO): ISO-8859-2, MS1250, UTF-8
Serbian (SH): ISO-8859-2, MS1250, UTF-8
Slovak (SK): ISO-8859-2, MS1250, UTF-8
Slovenian (SL): ISO-8859-2, MS1250, UTF-8
Swedish (SV): Latin1, MS1252, ISO-8859-9, UTF-8
Thai (TH): MS874, UTF-8
Ukrainian (UK): MS1251, UTF-8
Vietnamese (VI): MS1258, UTF-8
Glossary
Glossary Terms
ABAP
Proprietary language used by SAP. The Designer generates and installs ABAP code on the
application server to extract SAP data.
application server
Part of the three-tiered architecture of the SAP system. PowerCenter makes all requests
through the application server.
background process
A work process on the application server that the Integration Service uses to run file mode
sessions in the background, or batch mode.
BAPI
Business Application Programming Interface. An SAP programming interface to access SAP
from external applications that support the RFC protocol.
branch
The structure in the hierarchy that connects the nodes, extending from the root node to the
leaf nodes.
buffers
Shared memory area on the application server that holds query results.
cluster table
A table on the application server that does not have a one-to-one relationship with the related
table on the database server.
code block
Additional ABAP code you can add to the ABAP program. You can create code blocks in the
ABAP Program Flow dialog box in the Application Source Qualifier.
CPI-C
Common Program Interface-Communications. The communication protocol that enables
online data exchange and data conversion between the SAP system and PowerCenter. CPI-C
is used for stream mode sessions.
database server
Part of the three-tiered architecture of the SAP system. This server contains the underlying
database for SAP.
database views
Views on the application server that are based on views of transparent tables on the database
server. You can extract from a database view in the same way you extract from a transparent
table.
design-time transports
Transports you install and use in the development environment.
detail table
The SAP table that joins with the hierarchy. The detail table provides the data for the detail
range associated with the leaf nodes in the hierarchy.
development class
Structure in the SAP system that holds objects in the same development project. PowerCenter
creates the ZERP development class and holds all the PowerCenter objects.
dialog process
A work process on the application server that runs file mode sessions in the foreground.
exec SQL
Standard SQL that accesses the physical database. Use exec SQL to access transparent tables
and database views.
file mode
Use the file mode extraction method to extract SAP data to a staging file. File mode sessions
use RFC.
FROM_VALUE
The beginning range of values for the leaf node of the hierarchy. The Integration Service uses
either this column or the TO_VALUE column to join with the detail table.
functions
Generic modules in the SAP system. You insert SAP functions in the ABAP program to
extract source data.
gateway process
A work process on the application server that implements CPI-C protocol for stream mode
sessions.
hierarchy
Tree-like structure of metadata that defines classes of information. A hierarchy is composed of
nodes and branches.
IDoc
An Intermediate Document (IDoc) is a hierarchical structure that contains segments. Each
IDoc segment contains a header and data rows. You import IDoc source definitions in the
Source Analyzer.
InfoCube
A self-contained dataset in the SAP Business Information Warehouse (SAP BW) created with
data from one or more InfoSources.
InfoSource
A collection of data in the SAP Business Information Warehouse (SAP BW) that logically
belongs together, summarized into a single unit.
leaf nodes
The lowest nodes in the structure of the hierarchy. These nodes are keyed to the detail table
containing data.
nested loop
The syntax that open SQL uses to extract data. You can generate the ABAP program with a
nested loop by selecting the Force Nested Loop option in the Application Source Qualifier.
nodes
The structure at each level of the hierarchy. The highest level node is called the root node.
The lowest level nodes are called leaf nodes. Other level nodes are called nodes.
non-uniform hierarchy
A hierarchy that has different numbers of nodes across the branches.
open SQL
SQL written in ABAP used to query tables on the application server. Use open SQL to access
database views and transparent, pool, and cluster tables. When you join two or more sources
in one Application Source Qualifier, Open SQL uses a nested loop to extract data.
pool table
A table on the application server that does not have a one-to-one relationship with the related
table on the database server.
presentation server
The top layer of the SAP three-tiered architecture. The presentation server is usually a PC or
terminal that end users use to enter data into or query the SAP system.
qualifying table
The last table selected in the join order that you use to preface a join condition override.
RFC
Remote Function Call. A standard interface to make remote calls between programs located
on different systems. PowerCenter uses RFC each time it connects to the SAP system.
root node
The highest node in the structure of the hierarchy. It is the origin of all other nodes.
run-time transports
Transports you install in the development environment and then deploy to test and
production environments.
SAP BW
An SAP system containing the BW Enterprise Data Warehouse. The PowerCenter Connect
for SAP NetWeaver BW Option allows you to extract data from or load data into an SAP BW
system.
SAP BW Service
A PowerCenter application service that listens for RFC requests from the SAP BW system,
initiates workflows to extract from or load to the SAP BW system, and sends log events to the
PowerCenter Log Manager.
SAP DMI
SAP Data Migration Interface. Use the SAP interface to migrate data from legacy
applications, other ERP systems, or any number of other sources into SAP.
saprfc.ini
Connectivity file that enables PowerCenter to initiate RFC with the SAP and BW systems.
SETID
SAP term that provides a unique identifier for each hierarchy.
single-dimension hierarchy
A hierarchy with only one associated detail table. PowerCenter supports single-dimension
hierarchies.
static filter
A filter in the Application Source Qualifier to reduce the number of rows the ABAP program
returns. The Designer writes the static filter condition into the ABAP program as a WHERE
clause.
stream mode
Use the stream mode extraction method to extract SAP data to buffers. Data from the buffers
is streamed to the Integration Service. Stream mode sessions use CPI-C protocol.
structure variable
A variable in the ABAP program that represents a structure in the SAP system. Structures are
virtual tables defined in the SAP dictionary.
TO_VALUE
The end range of values for the leaf node of the hierarchy. The Integration Service uses either
this column or the FROM_VALUE column to join with the detail table.
tp addtobuffer
Transport system command used to add transport requests to a buffer prior to importing
them into the SAP system.
tp import
Transport system command used to import the transport requests into the SAP system.
transparent table
A table on the application server that has a table of matching structure on the database server.
transport
SAP system used to transfer development objects from one system to another. Use the
transport system when you install the PowerCenter development objects on SAP. See also
design-time transports on page 409 and run-time transports on page 412.
uniform hierarchy
A hierarchy with the same number of nodes in each of the branches.
variant
A BW structure that contains parameter values that the SAP BW system passes during
program execution.
work process
A process on the application server that executes requests. All PowerCenter requests for data
extraction go through a work process.
415
ACCP (SAP) BAPI (SAP)
definition 388 definition 408
Administration Console (SAP) basic IDoc type (SAP)
configuring SAP BW Service 56 identifying and verifying 297
ALE (SAP) BCI (SAP)
configuring in SAP 31 communication settings, deleting 44
definition 174, 194, 408 See business content integration
ALE configuration BCI listener mapping (SAP)
removing 44 basic IDoc type, identifying and verifying 297
APO (SAP) binary datatypes (SAP)
integrating 4 description 388
application connections (SAP) in the Application Source Qualifier 150
See also PowerCenter Workflow Administration Guide branch (SAP)
copying mappings 264 definition 408
Application Multi-Group Source Qualifier (SAP) buffer block size (SAP)
creating 177, 381 configuring 361
definition 381 buffers (SAP)
description 177 definition 408
application server (SAP) business components (SAP)
definition 408 creating 84
generate and install ABAP program 99 organizing the Navigator 83
importing source definitions 64 business content (SAP)
importing table definitions 65 See also business content integration
running sessions 154 definition 409
Application Source Qualifier (SAP) business content integration (SAP)
ABAP code blocks 135 DataSources 288, 294, 302
ABAP program variables 137 definition 288
changing datatypes 387 importing PowerCenter objects 295
configuring 150 listener workflow 299
creating 150 logical system 35
datatypes 150, 151 logical system in SAP 288
generating ABAP 121 mappings 289
generating SQL 122 processing mappings 301
joining sources 127 processing workflows 314
mapping variables 145 request files 312
source filters 141 scheduling 290, 316
applications (SAP) send request workflows 313
integrating 4 workflows 290
authority checks (SAP) business names (SAP)
adding 97 hierarchies 68
creating users 27 tables and views 65
running sessions 154, 206, 264 business solutions (SAP)
troubleshooting 169 integrating 4
authorization (SAP) BW data extraction (SAP)
See profiles Open Hub Service 8
BW hierarchy (SAP)
child node 338
B configuring 343
description 338
background process (SAP)
interval hierarchy 339
definition 408
leaf node 338
running file mode sessions 158
416 Index
root nodes 338 CPI-C (SAP)
sorted hierarchy 339 configuring sideinfo 39
time dependent hierarchy 339 definition 409
version-dependent hierarchy 339 integration process 5
BW Monitor (SAP) overview 12
log messages 370 running stream mode sessions 156
BW Option (SAP) CRM (SAP)
configuring 46 integrating 4
performance 361 CUKY (SAP)
upgrading 47 trimming trailing blanks 389
C D
caches (SAP) data extraction (SAP)
DMI data 214 process chains 328
inbound IDoc data 214 scheduling 328
session cache files 214 Data Migration Interface (SAP)
changing parameters (SAP) See SAP DMI 409
configuring 115 data selection entry (SAP)
definition 113, 228 configuring 365
CHAR (SAP) database server (SAP)
trimming trailing blanks 389 definition 409
child node (SAP) generating open SQL 122
description 338 importing table definition 65
cluster table (SAP) database views (SAP)
definition 409 definition 409
code blocks (SAP) extracting 65
See ABAP code blocks generating SQL 123, 124
code pages (SAP) datafiles (SAP)
selecting 398 definition 24
cofiles (SAP) DataSources (SAP)
definition 24 activating 294
column (SAP) non-hierarchy and hierarchy 302
DATS 392 overview 288
TIMS 392 datatypes (SAP)
commit call (SAP) ACCP 388
configuring for inbound IDocs 213 Application Multi-Group Source Qualifier 177
Common Program Interface-Communications (SAP) Application Source Qualifier 150
See CPI-C binary 391
connections (SAP) binary datatypes 388
testing 373 CHAR 389
connectivity (SAP) converting 390
communication interfaces 12 CUKY 389
importing source definitions 78 date/time 391
installing ABAP programs 99 DATS 388
integration overview 5, 8 filter conditions 143, 144
saprfc.ini file 41 INT1 391
sideinfo file 39 native 387
continuous workflows (SAP) numeric 391
See also PowerCenter Workflow Administration Guide overriding 151, 387
description 211 SAP 384
Index 417
  transformation 386
  trimming trailing blanks 389
  UNIT 389
  using 386
date format (SAP)
  converting 146
  key ranges for partitioning 165
DATE_CONVERT_TO_FACTORYDATE (SAP)
  description 109
DATS (SAP)
  description 388
definitions in the navigator (SAP)
  organizing 83
design-time transports (SAP)
  definition 409
DEST (SAP)
  generating and installing ABAP programs 99
  importing sources 78
detail ranges (SAP)
  FROM_VALUE 72
  hierarchies 72
  TO_VALUE 72
detail table (SAP)
  definition 409
  joining with hierarchies 73, 132
  properties 93
development class (SAP)
  $TMP 29, 99
  creating 29
  definition 409
  ZERP 24
development system (SAP)
  installing and configuring 22
development user (SAP)
  creating profiles for mySAP 28
  creating profiles for SAP BW 48
dialog process (SAP)
  definition 409
  file mode sessions 158
  stream mode sessions 156
DMI documents (SAP)
  foreign key 275
  primary key 275
dynamic filters (SAP)
  condition 142
  definition 410
  handling 141

E
environment variables (SAP)
  setting PMTOOL_DATEFORMAT 370
ERP (SAP)
  integrating 4
error handling (SAP)
  IDoc sessions 222
  RFC/BAPI sessions 269
error messages (SAP)
  BW sessions 328, 366
exec SQL (SAP)
  available generation mode 121
  available join type 127
  definition 410
  generating 123
  joining sources 128
  overriding join conditions 133
  Select options 89
export parameters (SAP)
  configuring 115
external logical system (SAP)
  See logical system

F
file direct (SAP)
  staging files 161
file lists (SAP)
  BCI request files 312
file mode (SAP)
  ABAP program 96
  configuring session properties 163
  definition 410
  running sessions 158
  sessions 154
file persistence (SAP)
  staging files 159
file reinitialize (SAP)
  staging files 159
filtering data (SAP)
  $$BWFILTERVAR 352
  configuring BW data selection entry 365
  description 352
  from flat file sources 353
  from relational sources 352
  from SAP R/3 sources 353
filters (SAP)
  condition 143
  difference between dynamic and static 141
  dynamic filters 141
  entering 141
  handling 141
  ISO 8859-1 characters 141
  multibyte characters 141
  NUMC datatype 144
  static filters 141, 143
  syntax rules 143
flat file sources (SAP)
  filtering data from 353
  mapping parameters 358
FROM_VALUE (SAP)
  definition 410
  detail ranges 72
  hierarchy relationship 74
FTP (SAP)
  staging filenames 163
  staging files 162
FunctionCall_MS mapplet (SAP)
  RFC/BAPI Multiple Stream mapping 243
FunctionCall_No_Input mapplet (SAP)
  RFC/BAPI No Input mapping 233
FunctionCall_SS mapplet (SAP)
  RFC/BAPI Single Stream mapping 251
functions (SAP)
  See SAP functions

G
gateway process (SAP)
  definition 410
generating SQL (SAP)
  available generation mode 121
glossary (SAP)
  terms 408
grid (SAP)
  restriction 208
groups (SAP)
  IDocs 180

H
hierarchies (SAP)
  creating columns 68
  definition 410
  detail ranges 72
  establishing relationships 73
  extracting 68
  FROM_VALUE 74
  importing 78
  importing definitions 68
  joining with tables 73, 132
  nodes 72
  non-uniform hierarchies 70
  overview 68
  running sessions 154
  sample definition 72
  session modes 154
  SETID 68, 73
  structure 72
  TO_VALUE 74
  uniform hierarchies 68
  viewing properties 93
hierarchy DataSources (SAP)
  processing mappings 302
high availability (SAP)
  restriction 208

I
Idle Time (SAP)
  configuring 217
  description 208
IDoc definitions (SAP)
  viewing 76
IDoc sessions (SAP)
  error handling 222
  troubleshooting 224
IDoc transfer method (SAP)
  definition 410
  described 341
IDocs (SAP)
  ABAP program flow restrictions 147
  attributes 76
  control information 76, 147
  definition 410
  editing IDoc type 94
  entering an IDoc filter 147
  filter condition syntax 149
  foreign key 196
  importing IDoc definitions 75
  importing IDocs with the same name 94
  joining with tables 132
  overview 6
  primary key 196
  properties 77
  segments and groups 180
  type 77
  validating the IDoc filter condition 149
import parameters (SAP)
  configuring 115
  definition 113
inbound IDoc mapping (SAP)
  configuring 203
inbound IDocs (SAP)
  configuring a session for invalid IDocs 213
  document number, passing values to 203
  processing invalid IDocs 203
  syntax validation 203
industry solutions (SAP)
  integrating 4
InfoCube (SAP)
  definition 410
InfoPackage (SAP)
  creating 364
  definition 411
  session names 360
  troubleshooting 373
  workflow names 362
InfoSource (SAP)
  3.x InfoSources 338
  activating 346
  assigning logical system 345
  creating 343
  definition 411
  for master data 338
  for transaction data 338
  importing 349
  loading to SAP BW 7.0 338
InfoSpoke (SAP)
  creating 325
inner join (SAP)
  ABAP join syntax 129
installation (SAP)
  changes to BW Option 46
  changes to mySAP Option 16
integration methods (SAP)
  BW Option 8
  mySAP Option 5
interfaces (SAP)
  overview 12
interval hierarchy (SAP)
  description 339

J
join conditions (SAP)
  rules 133
  specifying 133
joining sources (SAP)
  ABAP join syntax 129
  available join types 127
  exec SQL 128
  hierarchies 93
  open SQL 127
  tables and hierarchies 132
  tables and IDocs 132

K
key relationships (SAP)
  importing 66

L
language codes (SAP)
  selecting 396, 398
LCHR datatype (SAP)
  using with Select Distinct 90
leaf nodes (SAP)
  See also nodes
  definition 411
  description 338
libraries (SAP)
  Unicode for multiple SAP sessions 403
  Unicode for single SAP sessions 403
listener workflow (SAP)
  configuring 299
LMAPITarget (SAP)
  application connection 297
load balancing (SAP)
  SAP BW Service 56
  support for BW system 56
  Type B entry 41, 53
log events (SAP)
  See also PowerCenter Administrator Guide
  See also PowerCenter Workflow Administrator Guide
  viewing 332, 370
logical system (SAP)
  ALE integration 31
  assigning to InfoSource 345
  business content integration 35
  BW extraction and loading 52
  program ID 32, 35, 52
  ZINFABCI program 35
logical unit of work (SAP)
  definition 411
LRAW datatype (SAP)
  using with Order By 91
M
mapping parameters (SAP)
  $$BWFILTERVAR 352
  description 145
  flat file sources 358
  relational sources 357
  SAP R/3 sources 358
mapping shortcuts (SAP)
  installing ABAP programs 95
mapping variables (SAP)
  built-in 145
  date format 146
  sample filter condition 145
  user-defined 145
mappings (SAP)
  creating 326, 351
maximum processes (SAP)
  See also PowerCenter Administration Guide
  configuring for SAP RFC/BAPI function mappings 265
message recovery (SAP)
  See also PowerCenter Workflow Administration Guide
  description 210
  specifying partitions and a recovery cache folder 211
  using with certified messages 210
mySAP applications (SAP)
  integrating 4
mySAP Option (SAP)
  configuring 16
  uninstalling 44
  upgrading 16

N
namespace (SAP)
  naming ABAP programs 97
native datatypes (SAP)
  description 387
nested loop (SAP)
  definition 411
NFS (SAP)
  staging files 161
nodes (SAP)
  child node 338
  definition 411
  hierarchies 72
  leaf node 338
  leaf nodes 68
  root nodes 68, 338
  root nodes definition 412
non-hierarchy DataSources (SAP)
  processing mappings 302
non-uniform hierarchies (SAP)
  definition 411
  overview 70
NUMC datatype (SAP)
  description 387
  filters 144

O
Open Hub Service (SAP)
  description 8
open SQL (SAP)
  available generation mode 121
  available join type 127
  definition 411
  generating 122
  joining sources 127
  overriding join condition 133
  select options 89
Order By (SAP)
  cluster tables 92
  exec SQL 91
  open SQL 91
  pool tables 92
  source definitions 91
  transparent tables 91
  using the LRAW datatype 91
outbound IDoc mapping (SAP)
  configuring a session 207
outbound IDocs (SAP)
  configuring a session for invalid IDocs 211
  processing invalid IDocs 191
  session behavior 174
  session restriction 208
  syntax validation 191
  upgrading 191
outer join (SAP)
  ABAP join syntax 129
  using multiple outer joins 132
Output is Deterministic property (SAP)
  SAP/ALE IDoc Interpreter transformation 189

P
Packet Count (SAP)
  configuring 217
  description 208
  high availability 208
packet size (SAP)
  configuring for BW session 361
  configuring for IDoc sessions 213
partner profile (SAP)
  ALE integration 33
  business content integration 35
performance (SAP)
  configuring buffer block size 361
  loading to BW 341
permissions (SAP)
  running sessions 154, 206, 264
pipeline partitioning (SAP)
  See also PowerCenter Workflow Administration Guide
  configuring for inbound IDoc sessions 212
  configuring for outbound IDoc sessions 211
  configuring in the session properties 168
  restrictions for sources 165
  specifying partitions and a recovery cache folder 211
  stream mode sessions 157
PMTOOL_DATEFORMAT (SAP)
  setting environment variables 370
pool table (SAP)
  definition 411
PowerCenter workflows (SAP)
  creating 327, 361
  InfoPackage 364
  monitoring 332, 370
  recovering 372
Prepare mapplet (SAP)
  RFC/BAPI Multiple Stream mapping 240
presentation server (SAP)
  definition 411
process chain (SAP)
  definition 412
process chains (SAP)
  configuring 328
  loading data 366
  troubleshooting 333
processing mappings (SAP)
  creating 301
processing workflows (SAP)
  creating 314
production system (SAP)
  installing and configuring 22
production user (SAP)
  creating profiles for mySAP 28
  creating profiles for SAP BW 48
profiles (SAP)
  creating for mySAP 28
  creating for SAP BW 48
  setting for RFC/BAPI function mappings 29
program ID (SAP)
  ALE integration 32
  business content integration 35
  BW extraction and loading 52
  logical system 32, 35, 52
  saprfc.ini file 41, 53
program information (SAP)
  copying 104
PSA transfer method (SAP)
  definition 412
  described 341

Q
qualifying table (SAP)
  definition 412

R
Reader Time Limit (SAP)
  configuring 217
  description 210
Real-time Flush Latency (SAP)
  configuring 217
  configuring continuous workflows 211
  description 209
real-time sessions (SAP)
  configuring source-based commit 209
  defined 209
  setting Real-time Flush Latency 209
recovery (SAP)
  See message recovery
  enabling 361
  failed sessions 372
relational sources (SAP)
  filtering data from 352
  mapping parameters 357
remote function call (SAP)
  See RFC
request files (SAP)
  deploying 312
reusing (SAP)
  staging files 159
RFC (SAP)
  configuring saprfc.ini 41
  definition 412
  integration process 5, 8
  overview 12
  running file mode sessions 158
  troubleshooting 333
RFC/BAPI function mappings (SAP)
  authorization 29
  change data in SAP 229
  configuring a session 265
  copying 264
  custom return structures 253
  description 228
  function input data 261
  generating 255
  writing data to SAP 229
RFC/BAPI Multiple Stream mapping (SAP)
  definition 236
  FunctionCall_MS mapplet 243
  Prepare mapplet 240
  RFCPreparedData source 241
  RFCPreparedData target 241
  source definitions 238
  Source Qualifier transformation 239, 242
  target definitions 244
  Transaction_Support mapplet 243
RFC/BAPI No Input mapping (SAP)
  definition 232
  FunctionCall_No_Input mapplet 233
  source definitions 233
  Source Qualifier transformation 233
  target definitions 235
RFC/BAPI sessions (SAP)
  error handling 269
RFC/BAPI Single Stream mapping (SAP)
  definition 246
  FunctionCall_SS mapplet 251
  Sorter transformation 251
  source definitions 247
  Source Qualifier transformation 250
  target definitions 252
RFCPreparedData source (SAP)
  RFC/BAPI Multiple Stream mapping 241
RFCPreparedData target (SAP)
  RFC/BAPI Multiple Stream mapping 241
root nodes (SAP)
  description 338
  See nodes
run-time transports (SAP)
  definition 412

S
SAP 3.x system (SAP)
  ABAP program name 97
SAP 4.x system (SAP)
  ABAP program name 97
SAP BW (SAP)
  definition 412
  Unicode 403
SAP BW Service (SAP)
  creating 56
  definition 412
  viewing log events 332, 370
SAP DMI (SAP)
  definition 412
SAP DMI Prepare transformations (SAP)
  creating 277
  editing 280
SAP functions (SAP)
  ABAP program flow 125
  changing parameters 113, 228
  configuring function parameters 115
  DATE_CONVERT_TO_FACTORYDATE 109
  definition 410
  export parameters 113
  import parameters 113
  importing 111
  inserting in ABAP program flow 114
  parameters 109
  rules 117
  table parameters 113
  validating 117
  versioning 109
  viewing 113
SAP NetWeaver (SAP)
  application platform 4
SAP R/3 source definitions (SAP)
  importing 64, 78
  importing IDoc definitions 75
  importing key relationships 66
SAP R/3 sources (SAP)
  filtering data from 353
  mapping parameters 358
SAP sessions (SAP)
  troubleshooting 169
SAP system (SAP)
  importing source definitions 64
SAP system variables (SAP)
  See variables
SAP transparent table (SAP)
  creating a mapping 326
SAP/ALE IDoc Interpreter transformations (SAP)
  creating 182
  editing 189
  upgrading 191
SAP/ALE IDoc Prepare transformation (SAP)
  DOCNUM port 203
SAP/ALE IDoc Prepare transformations (SAP)
  creating 198
  editing 198
SAPALEIDoc source definitions (SAP)
  description 176
  using in a mapping 176
SAPALEIDoc target definitions (SAP)
  creating 202
saprfc.ini (SAP)
  configuring 41
  definition 412
  DEST 58
  entries 41, 53
  renaming 30, 51
  sample entry 41
  sample file 53
scheduling (SAP)
  BCI 290, 316
security (SAP)
  accessing staging files 162
  mySAP authorizations 28
  SAP BW authorizations 48
segments (SAP)
  IDocs 180
select distinct (SAP)
  overview 90
  using LCHR datatype 90
select single (SAP)
  description 89
Send IDocs Based On (SAP)
  configuring 213
send request workflows (SAP)
  creating 313
services file (SAP)
  creating entries 40
session conditions (SAP)
  Idle Time 208
  Packet Count 208
  Reader Time Limit 210
  Real-time Flush Latency 209
session properties (SAP)
  Cache Directory 221
  Cache Size 221
  configuring for IDoc mappings 216
  configuring for mappings with SAP R/3 sources 166
  Duplicate Parent Row Handling 221
  Extended Syntax Check 221
  NULL Field Representation 221
  Orphan Row Handling 221
  source directory 163
  staging directory 163
sessions (SAP)
  CPI-C 154
  failed 372
  file direct 161
  file mode 154
  FTP 162
  NFS 161
  overview 154, 206, 264
  partitioning 165
  read permissions 154, 206, 264
  recovering 372
  RFC 154
  stream mode 154
  troubleshooting 333, 373
SETID (SAP)
  definition 412
  hierarchies 68
sideinfo (SAP)
  renaming 30
sideinfo file (SAP)
  configuring 39
  creating entries in the services file 40
  definition 413
  sample entry 39
single-dimension hierarchy (SAP)
  definition 413
sorted hierarchy (SAP)
  description 339
Sorter transformation (SAP)
  RFC/BAPI Single Stream mapping 251
source definitions (SAP)
  configuring Order By 91
  editing 81
  importing 64, 78
  importing hierarchies 68
  importing tables and views 65
  organizing the Navigator 83
  RFC/BAPI Multiple Stream mapping 238
  RFC/BAPI No Input mapping 233
  RFC/BAPI Single Stream mapping 247
  SAPALEIDoc source definition 176
  transformation properties 163
  troubleshooting 85
source directory (SAP)
  session properties 163
Source Qualifier transformation (SAP)
  RFC/BAPI Multiple Stream mapping 239, 242
  RFC/BAPI No Input mapping 233
  RFC/BAPI Single Stream mapping 250
source system (SAP)
  See logical system
Source transformation properties (SAP)
  description 150
sources (SAP)
  hierarchies 68
  importing definitions 65
  tables 65, 122
  views 65
SQL (SAP)
  ABAP join syntax 124
  Application Source Qualifier 122
  exec SQL 123
  joining sources 127
  open SQL 122
staging directory (SAP)
  file mode session 163
  session properties 163
staging files (SAP)
  accessing on UNIX 162
  configuring session properties 163
  establishing access 161
  file direct 161
  file persistence 159
  file reinitialize 159
  FTP 162
  NFS 161
  preparing the staging directory 163
  reusing 159
  session properties 163
static filters (SAP)
  condition 143
  definition 413
  handling 141
stream mode (SAP)
  ABAP program 96
  ABAP program name 97
  configuring partitioning 157
  configuring sideinfo 39
  creating entries in the services file 40
  definition 413
  running sessions 156
  sessions 154
strings (SAP)
  date format 392
structure field variables (SAP)
  creating 137
  definition 137, 413
  using system variables 140
structure variables (SAP)
  creating 137
  definition 137, 413
sy-datum (SAP)
  description 140
syntax validation (SAP)
  inbound IDocs 203
  outbound IDocs 191
system variables (SAP)
  See variables
  assigning initial value 140
  creating 140

T
table parameters (SAP)
  configuring 115
  overview 113
tables (SAP)
  cluster 65, 122
  description 65
  detail 68, 73
  importing definitions 65, 78
  joining with hierarchies 132
  joining with IDocs 132
  pool 65, 122
  transparent 65, 122
target definitions (SAP)
  RFC/BAPI Multiple Stream mapping 244
  RFC/BAPI No Input mapping 235
  RFC/BAPI Single Stream mapping 252
  SAPALEIDoc target definition 202
test system (SAP)
  installing and configuring 22
time-dependent hierarchy (SAP)
  description 339
$TMP (SAP)
  development class 29
  installing ABAP programs 99
TO_VALUE (SAP)
  definition 413
  hierarchy definition 72
  hierarchy relationship 74
tp addtobuffer (SAP)
  definition 413
tp import (SAP)
  definition 413
  transporting development objects 24
Transaction_Support mapplet (SAP)
  RFC/BAPI Multiple Stream mapping 243
transactional RFC (SAP)
  definition 414
transfer methods (SAP)
  IDoc 341
  PSA 341
transfer rules (SAP)
  setting 346
transformation datatypes (SAP)
  description 387
transparent table (SAP)
  definition 413
transport (SAP)
  definition 413
  overview 13
transport programs (SAP)
  YPMPARSQ 26
transports
  SAP, deleting 22
transports (SAP)
  deleting transport objects 44
  import 24
  installing 24
TreatCHARasCHARonRead (SAP)
  trimming trailing blanks 389
tRFC (SAP)
  definition 414
troubleshooting (SAP)
  authority checks 169
  BW sessions 333, 373
  extracting data from BW 333
  IDoc sessions 224
  importing source definitions 85
  InfoPackage 373
  installing ABAP programs 106
  loading data into BW 373
Type A (SAP)
  entry in saprfc.ini 41, 53
Type B (SAP)
  entry in saprfc.ini 41, 53
Type R (SAP)
  entry in saprfc.ini 41, 53

U
Unicode (SAP)
  entering source filters 141
  libraries for multiple SAP sessions 403
  libraries for single SAP sessions 403
  SAP support 403
  uploading ABAP programs 99
uniform hierarchies (SAP)
  definition 414
  examples 68
uninstalling (SAP)
  See also PowerCenter Installation and Configuration Guide
  mySAP Option 44
  SAP environment, cleaning 44
  transport objects 44
uninstalling ABAP program (SAP)
  description 102
UNIT (SAP)
  trimming trailing blanks 389
user-defined commit (SAP)
  configuring for inbound IDocs 213
UTF-16 (SAP)
  SAP Unicode data 403

V
validation (SAP)
  ABAP code blocks 136
  ABAP program flow 126
  SAP functions 117
variables (SAP)
  ABAP program variables 126, 137
  ABAP type variables 138
  date format 146
  mapping variables 145
  naming conventions 137
  SAP system variables 140
  structure field variables 137
  structure variables 137
version-dependent hierarchy (SAP)
  description 339
versioned mappings (SAP)
  See also PowerCenter Repository Guide
  installing an ABAP program 98
  purging a mapping with an ABAP program 98
  undoing a checkout with an ABAP program 98
  uninstalling an ABAP program 98
versioned objects (SAP)
  See versioned mappings
versioning (SAP)
  SAP functions 109
views (SAP)
  description 65
  extracting 65
  generating SQL 123, 124
  importing definitions 65, 78
Virtual plug-in source definition (SAP)
  definition 380
Virtual plug-in target definition (SAP)
  creating 382
  definition 382
W
work process (SAP)
  definition 414
workflows (SAP)
  See PowerCenter workflows
  overview 206
  running 223
  stopping 373

Y
YPMPARSQ (SAP)
  description 26

Z
ZERP (SAP)
  description 24
ZINFABCI (SAP)
  logical system 35
ZINFABCI transaction (SAP)
  communication settings, deleting 44
ZPMSENDSTATUS (SAP)
  configuring for data extraction 328
  configuring for data loading 366
  creating a variant 329
  description 59