
Modeling
Generated on: 2019-06-12

SAP NetWeaver 7.4 | 7.4.16

PUBLIC

Warning

This document has been generated from the SAP Help Portal and is an incomplete version of the official SAP product documentation. The information included in custom documentation may not reflect the arrangement of topics in the SAP Help Portal, and may be missing important aspects and/or correlations to other topics. For this reason, it is not for productive use.

For more information, please visit the SAP Help Portal.


Distributing Data to Other Systems


Use
As well as data staging and data processing, the data warehousing capabilities in BW also offer processes for distributing data.

You can distribute data both within the BW system and to other systems. In the latter case, the BW system is the source, or hub,
of the data transfer.

Features
The tool for distributing data between BW systems is the Data Mart Interface.

More information: Exchanging Data Between BW Systems.

There are various tools for distributing and extracting data to other SAP or non-SAP systems, including the Open Hub
Destination.

More information: Extracting Data from the BW System

Exchanging Data Between BW Systems


Use
The data mart interface makes it possible to post data from one InfoProvider to another.

You have the following options:

Exchanging data between multiple BW systems: The sending system is referred to as the source BW, while the receiving
system is referred to as the target BW. The individual BW systems arranged in such a way are called data marts. The
InfoProviders of the source BW are used as the data source.

Data exchange between BW systems and other SAP systems

For more information, see Data Mart Interface Between Several Systems.

Data marts can be used in different ways:

They save a subset of the data for a Data Warehouse in another database, possibly even at a different location.

They are smaller units of a Data Warehouse.

They are stored as intentionally redundant segments of the (logical, global) overall system (Data Warehouse).

Features
A BW system makes itself available to another BW system as the source system by

providing metadata

providing transaction and master data.

An export DataSource is needed to transfer data from a source BW into a target BW. Export DataSources for InfoCubes and DataStore objects contain all the characteristics and key figures of the InfoProvider. Export DataSources for master data contain the metadata for all attributes, texts, and hierarchies for an InfoObject.

Constraints
Changes to the metadata of the source system can only be transferred to the export DataSources by regenerating these DataSources.

The Delete function is not supported at present.

You can only generate an export DataSource from an InfoCube if:

the InfoCube is activated

the name of the InfoCube is at least one character shorter than the maximum length of a name, since the DataSource name is made up of the InfoCube name and a prefix.

Note the special features of the assignment of a source system to a source system ID in data mart scenarios.

More information: Assigning a Source System to a Source System ID

Data Mart Interface Between Several Systems

Use
The data mart interface can be used between two BW systems or between another SAP system and a BW system:

to enable large amounts of data to be processed

to increase the respective speeds of data transfer, logging and analysis.

to achieve improved conciseness and maintainability of the individual Data Warehouses.

to enable data separation related to the task area on the one hand, and analyses of the entire dataset on the other.

to reduce complexity when constructing and implementing a Data Warehouse.

to construct hub-and-spoke scenarios in which a BW system forms the hub where the data from distributed systems converges and is standardized.

Integration
In terms of maintenance and definition, data marts are handled in the same way as SAP source systems. Here too, you can group together data from one or more source systems in a BW system, or continue to work with several BW systems.

Features
These functions can produce different architectures in a Data Warehouse landscape.

Replicating architecture:

If you select this architecture, the data for a BW server is available as source data and can be updated in further target BW
systems.

Aggregating Architecture:

With aggregating architecture, data is grouped together from two or more BW servers, and is then available for further
processing.

Using the Data Mart Interface


Use
If you want to define an existing BW system as a source system, you use the functions of the Data Mart interface.

Procedure
1. In the source BW system

Before a BW system can request data from another BW system, it must have information about the structure of the data to be
requested. For this, you have to upload the metadata from the source BW into the target BW.

You generate an export DataSource for the respective InfoProvider in the source BW. This export DataSource includes an extract structure that contains all the characteristics and key figures of the InfoProvider.

For InfoCubes/DataStore Objects: Generating Export DataSources for InfoProviders

For InfoObjects: Generating Master Data Export DataSources

2. In the target BW system

1. Define the source system if required: Creating SAP Source Systems. This is only necessary if the source system has not yet been created in the target BW.

2. Create the InfoProvider that you want the data to be loaded into.

3. Copy the metadata of your export DataSource from your source BW into your target BW. You can copy all the metadata
in a source system using the source system tree, or you can choose to copy only the metadata of your DataSource.

More information: Replication of DataSources.

4. Activate the DataSource.

5. Create an InfoPackage using the context menu of your DataSource. Your source BW system is specified as the source by default.

More information: Creating InfoPackages

6. Using the context menu for your DataSource, create a data transfer process with the InfoProvider into which the data is
to be loaded as the target. A default transformation is created at the same time.

More information: Creating Data Transfer Processes

The complete data flow is displayed in the InfoProvider tree under your InfoProvider.

7. Schedule the InfoPackage and the data transfer process. We recommend always using process chains for loading
processes.

More information: Creating Process Chains

Result
You have loaded data from one BW system into another BW system.

Generating Export DataSources for InfoProviders

Use
The export DataSource is needed to transfer data from a source BW system into a target BW system. You can use the selected
InfoProvider as a DataSource for another BW system.

Prerequisites
The source BW system must be created and actively saved in the target BW system as a BW source system. More information: Creating SAP Source Systems

Procedure
1. In the Data Warehousing Workbench in the BW source system, choose Modeling and select the InfoProvider tree.

2. Generate the export DataSource using the context menu of your InfoProvider. To do this, choose Additional Functions → Generate Export DataSource.

Note
The technical name of the export DataSource is made up of the number 8 and the name of the InfoProvider.

 Example
Technical name of an InfoCube: COPA

Technical name of the export DataSource: 8COPA

Generating Master Data Export DataSources


Prerequisites
The source BW system must be created and actively saved in the target BW system as the BW source system.

More information: Creating SAP Source Systems


Context
The export DataSource is needed to transfer data from a source BW system into a target BW system.

You can generate an export DataSource for master data, and thus for individual InfoObjects. This creates all the metadata that is required to extract master data (all attributes, texts, and hierarchies) from an InfoObject.

Procedure

1. Select the InfoObject tree in the Data Warehousing Workbench in the source BW system.

2. Generate the export DataSource from the context menu of your InfoObject. To do this, choose Additional Functions → Generate Export DataSource.

Note
The technical name of the export DataSource is

8******M for attributes (M stands for 'master data')

8******T for texts

8******H for hierarchies

(The asterisks (*) stand for the source InfoObject.)

Therefore, when you create an InfoObject or a master data InfoSource in the source BW system, make sure that the
length of the technical name of each object is no longer than 28 characters.
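
Example
For a hypothetical InfoObject ZPRODUCT, for instance, the generated export DataSources would be named 8ZPRODUCTM (attributes), 8ZPRODUCTT (texts), and 8ZPRODUCTH (hierarchies).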

Data Transfer Using the Data Mart Interface


Use
The data transfer is the same as in an SAP system. The system reads the data from the fact table of the BW system delivering the data, taking into account the specified dimension-specific selections.

Delta Procedure

Using the data mart interface, you can transfer data by full upload as well as by delta requests.

Note
You cannot enter selections for init and delta requests. Selections can be made and evaluated for full requests.

A distinction is made between InfoCubes and DataStore objects.

The InfoCube that is used as an export DataSource is first initialized, meaning that the current status is transferred into the target BW system. When the next upload is performed, only requests added since the initialization are transferred. Different target systems can also be filled like this.

Note

Only requests that have been rolled up successfully into the aggregates are transferred. If no aggregates were used,
only requests that are set to 'Qualitatively OK' in InfoCube Administration are transferred.

For DataStore objects, the requests in the change log of the DataStore object are used as the basis for the delta
determination. Only change log requests resulting from reactivation of the DataStore object data are transferred.

Restriction:

You can only make one selection for each target system for the delta.

 Example
You first make a selection using cost center 1 and load deltas for this selection. At a later time, you also decide to load a delta for cost center 2 in parallel to the cost center 1 delta. The delta can then only be requested for both cost centers together; it is not possible to execute the deltas separately for the different selections.

Exchanging Data Between BW Systems


Using the ODP Source System
The Operational Data Provisioning framework allows you to use the ODP source system, so that the same technology is used for a data mart scenario between BW systems as for providing other SAP source data to the BW system.

The sending system is referred to as the source BW, while the receiving system is referred to as the target BW. Because InfoProviders are available in the source BW system as providers of the operational delta queue (ODP context BW), the InfoProviders of the source BW can be made available in the target BW system as operational data providers for the data transfer.

The following InfoProviders are supported for the data mart scenario:

DataStore objects

InfoCubes

Semantically partitioned objects

HybridProviders

MultiProviders

InfoSets

InfoObjects for master data, texts and hierarchies

Queries as InfoProviders

Prerequisites
The source BW system has release SAP NetWeaver 7.4 SPS 05 or higher.

The target BW has one of the following releases or higher:

SAP NetWeaver 7.3 SPS 08 and SAP Note 1780912

SAP NetWeaver 7.3 EhP1 SPS 05 and SAP Note 1780912

SAP NetWeaver 7.4 SPS 05

Procedure

To set up the data mart scenario, perform the following steps in the target BW system:

1. Create an ODP source system for the source BW system with ODP context BW.

2. In the DataSource tree for the ODP source system, create the DataSource for the InfoProvider of the source BW system.
You have two options:

You can replicate the operational data provider from the source BW system to the target BW system as a
DataSource.

You can create the DataSource.

Related Information
Creating an ODP Source System
Creating DataSources for ODP Source Systems

Extracting Data from the BW System


Concept
Purchasing an open hub license allows you to distribute data to other non-SAP systems using the open hub destination and use
various additional methods to extract data from the BW system.

Various tools suitable for extracting mass data are available:

Open hub destination

with a database table as the destination

with a file as the destination

with SAP Data Services and Third-Party Tools as Destinations

Write data to a file using an analysis process

More information: Data Targets for an Analysis Process

Using the Data Federator: Access to BW data using SAP BusinessObjects tools (Web Intelligence, Explorer, Xcelsius). An
open hub license is not required.

More information: Data Federator

Besides the tools, we also provide you with different interfaces that you can use to program a data extraction operation from the BW system. These interfaces are not suitable for mass data, however:

Various interfaces to connect third-party front-end tools

More information: Open Analysis Interfaces

BI Java SDK

For more information, see “BI Java SDK” in the documentation for SAP Business Warehouse.

Data Mart Interface

More information: Data Mart Interface

Distributing Data from the BW System



Use
The open hub destination allows you to distribute data from a BW system to non-SAP data marts, analytical applications, and
other applications. It allows you to ensure controlled distribution across several systems.

The open hub destination defines the target to which the data is relayed.

Any objects supported by the data transfer process can be used as open hub data sources.

The following figure outlines how the open hub destination is integrated into the data flow:

Database tables (in the database for the BW system and SAP HANA DB) and flat files can be used as open hub destinations. You can extract the data from a database to downstream systems using APIs, SAP Data Services, or a third-party tool.

Prerequisites
You have already loaded data into an InfoProvider or DataSource.

Procedure
1. Create Open Hub Destinations

Create an open hub destination with the required target.

More information: Creating Open Hub Destinations

2. Create Transformations

You can create a transformation for your open hub destination automatically or manually. Note that not all rule types are available in the transformation for an open hub destination: reading master data, time conversion, currency translation, and unit conversion are not available.

More information: Creating Transformations

3. Create Data Transfer Processes

You can create a data transfer process for your open hub destination automatically or manually.

More information: Creating Data Transfer Processes

Creating Open Hub Destinations


Procedure

1. Under Modeling in the Data Warehousing Workbench, select the open hub destination tree.

2. In the context menu for your InfoArea, choose Create Open Hub Destination.

3. Enter a technical name and a description. We recommend using the object from which you want to update data to the open hub destination as the template. This allows you to create a transformation and a DTP, which are then activated when the open hub destination is activated.

Note
For technical reasons, an InfoSet cannot be used as a template, although it can be a source.

4. On the Destination tab page, select the required destination. The other settings that you can make on this tab page
depend on which destination you select. More information:

Database Tables as Destinations

Files As Destinations

SAP Data Services and Third-Party Tools as Destinations

5. On the Field Definition tab page, edit the field list. More information: Field Definition.

6. Activate the open hub destination.

Results
You can now use the open hub destination as a target in a data transfer process. BW objects valid for the DTP can be used as a
source.

More information: Creating Data Transfer Processes

Selecting Database Tables as Destinations


You can write data from the BW system to a database table via the open hub destination.


Context
You can also connect a further database. The data is then also written both to the generated table and to this database. This allows you to publish data directly from the BW system to other systems. The data does not have to be replicated, and no third-party tools are required. If you are using SAP HANA, you can also process the data subsequently using SAP HANA functions.

Procedure
1. Choose DB Table as the type of destination. When you activate the open hub destination, the system generates a database table. The generated database table has the name /BIC/OHxxx, where xxx is the technical name of the destination.

2. With an extraction to a database table, you can either retain the history of the data or just store the new data in the table. If you want to overwrite the fields, choose Delete Data from Table when defining your destination. The table is then completely deleted and regenerated before every extraction. We recommend this if you do not want to store the history of the data in the table. If you do not select this option, the system generates the table just one time, before the first extraction. This allows you to retain the history of the extracted data.

Note

Note that the table is always deleted and regenerated if changes are made to the properties of the database table (by adding fields, for example).

3. You can choose whether to use a technical key or a semantic key.

4. If you set the Technical Key flag, a unique key is added. This consists of the technical fields OHREQUID (open hub request SID), DATAPAKID (data package ID), and RECORD (sequential number of a data record to be added to the table in a data package). These fields represent the individual key fields of the table.

Using a technical key with a target table is particularly useful if you want to extract data to a table that is not deleted
before extraction. If an extracted record has the same key as an existing record, this duplication causes a short dump.

5. If you set the Semantic Key flag, the system selects all fields in the field list as semantic keys. You can change this selection in the field list. Note, however, that using a semantic key can result in duplicate records.

The records are not aggregated. Instead each extracted record is saved in the table.

6. If you specify a previously created connection to a database for DB Connection Name, the data (together with the generated table) is written to this database. You can create or edit the connection to the database in the DBA cockpit. For more information about the DBA cockpit, go to http://help.sap.com/netweaver → SAP NetWeaver Platform → Application Help → Function-Oriented View → Database Administration → <Your Database>.

In the target system (in the database), a table is created with the prefix /BIC/.

If the DB instance also has an SAP instance, like an SAP ERP system on a SAP HANA database for example, the DDIC
object also has to be generated manually. The namespace is also registered when this is done. The DB table name in the
remote system is identical to the DDIC name in the local system, or corresponds to the name that would have been
created locally on the database.

Note
You have to make sure here that there is no table with the same name in the target system. Otherwise an error will
occur during creation. The SAP_BASIS version of both SAP systems must also be either identical or compatible, and
the codepage (ASCII, Unicode) should match.
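
As an illustration of such downstream access, the following Python sketch reads a generated open hub table directly from an SAP HANA database using the hdbcli client. The host, credentials, the schema SAPABAP1, and the destination name SALES (table /BIC/OHSALES) are placeholder assumptions and not part of this documentation.

# Minimal sketch: read the table /BIC/OH<destination> generated by an open hub
# destination from SAP HANA using the hdbcli client. Host, credentials, the
# schema SAPABAP1 and the destination name SALES are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hanahost", port=30015,
                     user="DOWNSTREAM_USER", password="secret")
try:
    cursor = conn.cursor()
    # The generated table name contains a slash, so it must be quoted.
    cursor.execute('SELECT * FROM "SAPABAP1"."/BIC/OHSALES"')
    for row in cursor.fetchall():
        print(row)
finally:
    conn.close()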

Selecting Files As Destinations


You can write data from the BW system to a file via the open hub destination.

Procedure
1. Choose File as the type of destination. The file formats CSV, ASCII, and XML are supported during extraction to files. A control file with information about the metadata is also generated.

2. You can save the file either on the application server or in a local directory. For technical reasons, you can only select file format XML for files on the application server. If the file is located on the application server, you can schedule extraction in the background.

Caution
If you save the file locally, the file size must not exceed half a gigabyte. When transferring mass data, you should save the file on the application server.

The User Account Control (UAC) in Windows 7 and Windows Vista can create the impression that data is written to a local directory, but no files are actually created. Do not use a directory with restricted authorization.

3. If you want to write the data to an application server from the BW system, you have two options for the file name:

File name: The file name is made up of the technical name of the open hub destination and the extension .CSV, .TXT or .XML. You cannot change this name.

Logical file name: You can use input help to select a logical file name that you have already defined in Customizing. To do this, you create a logical path and assign a logical file name to it. More information: Defining the Logical Path Name and File Name

A logical file name can be made up of fixed path information and variables, such as calendar day and time. Logical file names can be transported.

4. If you save the file in a local directory, you cannot change the name of the file. It is made up of the technical name of the open hub destination and the extension .CSV, .TXT or .XML. The associated control file also has the prefix S_.

5. If you select data format XML, you can choose the following formats:

SAP Standard XML is the default SAP XML format. It is optimized for data exchange between SAP and external systems in both directions.

SAP Binary Protocol XML is an internal binary SAP variant of asXML. It can only be used in the SAP system environment. With binary and structure compression, however, it provides considerable performance and storage advantages compared to other XML options.

The XML is generated using the CALL TRANSFORMATION statement. Further information about asXML can be found in the ABAP keyword documentation.

6. You can select Default Setting or Direct Entry as the string setting. If you select Default Setting, the system code page is
used. If you select Direct Entry, you can specify your own code page.
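
As a sketch of how a downstream process might pick up such an extraction, the following Python snippet reads a CSV file written by an open hub destination. The directory, the destination name SALES, the separator, and the use of pandas are assumptions; the file name and the S_ prefix of the control file follow the rules described above.

# Minimal sketch: read the CSV file and the control file written by an open hub
# destination named SALES. Directory and separator are placeholder assumptions.
import pandas as pd

data_file = "/sapmnt/extracts/SALES.CSV"       # <destination name>.CSV
control_file = "/sapmnt/extracts/S_SALES.CSV"  # control file with metadata (prefix S_)

data = pd.read_csv(data_file, sep=";")         # adjust the separator if required
metadata = pd.read_csv(control_file, sep=";")
print(data.head())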

SAP Data Services and Third-Party Tools as Destinations

Concept
You can use the open hub destination to extract data to downstream systems. Various APIs allow you to connect SAP Data Services and third-party tools of certified partners to the BW system and to use this third-party tool to distribute data to other downstream systems.

SAP Data Services can be connected directly using the destination Third-Party Tools. Data Services supports serial and parallel extraction. For more information, read the documentation SAP BusinessObjects Data Services Supplement for SAP under http://help.sap.com → Analytics → All Products → select your language → Data Services.

Any objects supported by the data transfer process can be used as data sources. The data is temporarily extracted to a database table on the BW system. The third-party tool receives a message when the extraction process is complete. Choose Parameters to specify the parameters that you want to pass to the third-party tool. You can monitor the extraction process for reading data by API ("Monitor" function).

You can connect one or more data transfer processes to an open hub destination of type Third-Party Tool.

Using a process chain, you can start the extraction process either in the BW system itself or from the third-party tool.

More Information
For detailed information about certification and the scenario BW-OHS, visit the SAP Community Network at http://scn.sap.com/docs/DOC-7688.

Extracting Data to a Third-Party Tool


Procedure
You can extract data to a third-party tool as follows:

1. Define an open hub destination with Third-Party Tool as the destination type.

2. Create an RFC destination for your third-party tool and enter it in the de nition of the open hub destination.

3. Use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.

4. Start extraction immediately or include it in a process chain. You can also start this process chain from the third-party
tool using process chain API RSPC_API_CHAIN_START. The extraction process writes the data to a database table in the
BW system.

5. When the extraction process is finished, the system sends a notification to the third-party tool using API RSB_API_OHS_3RDPARTY_NOTIFY.

6. The extracted data is read by API RSB_API_OHS_DEST_READ_DATA or RSB_API_OHS_DEST_READ_DATA_RAW.

7. The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.
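
For orientation, the following Python sketch uses the open source PyRFC client to start the process chain over RFC from the third-party side (step 4). The connection details and the chain name ZOH_CHAIN are placeholder assumptions; the calls for steps 3, 6, and 7 are sketched in the API sections below.

# Minimal sketch: start the extraction process chain over RFC (step 4).
# Connection details and the chain name ZOH_CHAIN are placeholders, and
# I_CHAIN is assumed to be the import parameter of RSPC_API_CHAIN_START.
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
conn.call("RSPC_API_CHAIN_START", I_CHAIN="ZOH_CHAIN")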

APIs for Third-Party Tools as Destinations


API: RSB_API_OHS_DEST_SETPARAMS
You use this API to transfer the parameters of the third-party tool that are required to extract data to the BW system. These
parameters are saved in a parameter table in the BW system in the metadata for the open hub destination.

Parameters Type Description

Import OHDEST RSOHDEST Name of the open hub destination

3RDPARTYSYSTEM LOGSYS Third-party system (logical system)

EXTEND RS_BOOL Extending the parameter

Export RETURN BAPIRET2

Tables PARAMETERS BAPI6107PA Parameter table


EXTEND: If the value is true, new parameters are added to the existing parameters. If the value is false, the existing
parameters are deleted and the new parameters are inserted. The default setting is false.

The PARAMETERS table contains all the parameters required by the third-party system and saves these parameters in the metadata of the open hub destination. When extraction is finished, these parameters are sent to the third-party system (3RDPARTYSYSTEM) using API RSB_API_OHS_DEST_SEND_NOTIFICATION.
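
A minimal PyRFC sketch of this call follows. The connection details, the destination name OHD_SALES, and the logical system name are placeholders, and the row field names of parameter table BAPI6107PA are assumed to be simple name/value pairs; check the structure in your system.

# Sketch: store third-party parameters in the open hub destination OHD_SALES.
# The field names NAME/VALUE of the BAPI6107PA rows are assumptions.
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
result = conn.call("RSB_API_OHS_DEST_SETPARAMS",
                   OHDEST="OHD_SALES",
                   EXTEND="",  # false: replace the existing parameters
                   PARAMETERS=[{"NAME": "TARGET_DIR", "VALUE": "/extracts/sales"}],
                   # 3RDPARTYSYSTEM starts with a digit, so it is passed via dict unpacking
                   **{"3RDPARTYSYSTEM": "THIRDPARTY"})
print(result["RETURN"])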

API: RSB_API_OHS_3RDPARTY_NOTIFY
This API sends a message to the third-party tool after extraction. It transfers the open hub destination, the request ID, the
name of the database table, the number of extracted data records and the time stamp. You can also add another parameter
table containing parameters that are only relevant for the third-party tool.

Parameters Type Description

Import OHDEST RSOHDEST Name of the open hub destination

REQUESTID RSBREQUIDOUT Request ID

NUMB_OF_PACKETS I Number of packages extracted

DBTABNAME RSBTABNAME Name of DB table

DBRECORDS SYTABIX Number of records extracted

TIMESTAMP BAPIBP_TIMESTAMP Time stamp of extraction (request)

Export RETURN BAPIRET2

Tables PARAMETERS BAPI6107PA Parameter table

Exceptions COMMUNICATION_FAILURE

SYSTEM_FAILURE

API: RSB_API_OHS_REQUEST_SETSTATUS
This API sets the status of extraction to the third-party tool in the monitor. If the status is green, the request is processed
further. Red means that the existing table is not overwritten. The third-party tool sets the status to red when there is a
processing error in the third-party system. This ensures that the existing data is not overwritten until the error has been
corrected.

The status is written to table RSBREQUID3RD.

The following diagram shows how the status entry is processed:


The OHD state is the status of extraction to the BI system; it is dependent on the status of the third-party tool. This status is
set by the third-party tool.

STOP means that the request has not been started. The BI system waits until the status of the third-party tool is set to green.
Only then does it start extraction.

Parameter Type Description

Import REQUESTID RSBREQUIDOUT Request ID

STATUS RSBSTAT3RD Status of processes in third-party tool; G = Green, R = Red

MESSAGE BAPI_MSG Message for the monitor. This text can contain 220 characters; however, only 200 characters are displayed in the monitor.

Export RETURN BAPIRET2
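
A minimal PyRFC sketch of reporting the processing status back to the monitor; the connection details and the request ID are placeholders.

# Sketch: set the status of an open hub request in the BW monitor.
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
request_id = 4711  # placeholder; use the value and format received with the notification
conn.call("RSB_API_OHS_REQUEST_SETSTATUS",
          REQUESTID=request_id,
          STATUS="G",  # G = green (processed successfully), R = red (error)
          MESSAGE="Processed by third-party tool")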

API: RSB_API_OHS_DEST_GETLIST
This API delivers a list of all open hub destinations.

Parameters Type Description

Import OHDEST RSOHDEST Name of the open hub destination

DESTTYPE RSDESTTYPE Type of destination:

TAB = DB table

TAB3 = third-party tool

FILE = flat file

Export RETURN BAPIRET2

Tables DEST_TAB RSBOHDESTS List of open hub destinations

Table DEST_TAB contains all the destinations that the import parameter accesses.
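
A minimal PyRFC sketch that lists all destinations of type third-party tool; the connection details are placeholders.

# Sketch: list all open hub destinations of type third-party tool (TAB3).
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
result = conn.call("RSB_API_OHS_DEST_GETLIST", DESTTYPE="TAB3")
for dest in result["DEST_TAB"]:
    print(dest)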

API: RSB_API_OHS_DEST_GETDETAIL
This API finds the details of an open hub destination.

The target system gets the structure and the table with the metadata of the open hub destination.

This API is used at design time and at runtime.

Parameters Type Description

Import OHDEST RSOHDEST Name of the open hub destination

SKIP_TECKEY RS_BOOL Skip technical key (X)

Export RFCINFOSPOKE RSINFOSPOKE Name of InfoSpoke (not relevant for new open hub destination, and remains initial)

RFCUPDATEMETHOD RSBUPDMODE Extraction mode of InfoSpoke (not relevant for new open hub destination, and remains initial)

RFCDATABASETABLENAME RSBTABNAME Name of DB table

RFCPROCESSCHAIN RSPC_CHAIN Process chain; if more than one process chain is available, this parameter remains empty.

RFCDESTYPE RSDESTTYPE Type of open hub destination

RFCTLOGOSRC RSTLOGOSRC TLOGO type of data source

RETURN BAPIRET2

Tables DBTAB_STRUCTURE BAPI6118DALO Structure of the DB table of the open hub destination

PARAMETERS BAPI6107PA Parameter table of third-party tool

T_MESSAGES BAPIRETTAB Messages

RFCDTPT Data transfer process ID (for new open hub destination)

RFCPROCESSCHAINT Table of process chains (for new open hub destination)
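
A minimal PyRFC sketch that reads the details of a destination; the connection details and the destination name OHD_SALES are placeholders.

# Sketch: read the metadata of the open hub destination OHD_SALES.
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
detail = conn.call("RSB_API_OHS_DEST_GETDETAIL",
                   OHDEST="OHD_SALES",
                   SKIP_TECKEY="X")        # skip the technical key fields
print(detail["RFCDATABASETABLENAME"])      # generated database table
for field in detail["DBTAB_STRUCTURE"]:    # field list of the table
    print(field)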

API: RSB_API_OHS_DEST_READ_DATA_RAW
This API reads data from the database table in the BW system. Unlike API RSB_API_OHS_DEST_READ_DATA, you do not need to make sure that the same code pages are used. You can specify the code page using the ENCODING parameter. The data transfer is binary (in raw format).

Recommendation
We recommend using the new API instead of RSB_API_OHS_DEST_READ_DATA, as the new API can be used with different
code pages.

Parameters:

Parameter Type Description

Import OHDEST RSOHDEST Name of the open hub destination

REQUESTID RSBREQUIDOUT Request ID

PACKETID I Data package to be read. If the value is empty or null, all packages are read.

SKIP_TECKEY RS_BOOL Skips technical keys in the target structure (layout structure)

ENQUEUE_LOOP_COUNT I Counter for lock loop. A loop waits five seconds before the next loop starts.

ENCODING ABAP_ENCOD Identifier for the character format (UTF-8, UCS-2, etc.). For more information, see SAP Note 73606.

Export NUMROWS BAPI6116XX-NUMROWS Number of rows

LINES_PER_RECORD BAPI6116XX-NUMROWS Number of lines per data record

NUMB_OF_PACKETS I Number of packages for this request in the DB table

RETURN BAPIRET2

Tables DATALAYOUT BAPI6118DALO Layout of the data structure

RESULTDATA BAPI6100DARAW Data record (binary) with continuation indicator
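
A minimal PyRFC sketch that reads all packages of a request in raw format; the connection details, the destination name, and the request ID are placeholders, and decoding the raw records according to DATALAYOUT is only indicated.

# Sketch: read all data packages of an open hub request in raw (binary) format.
from pyrfc import Connection

conn = Connection(ashost="bwhost", sysnr="00", client="100",
                  user="EXTRACT_USER", passwd="secret")
request_id = 4711  # placeholder; received with the notification (step 5)
result = conn.call("RSB_API_OHS_DEST_READ_DATA_RAW",
                   OHDEST="OHD_SALES",
                   REQUESTID=request_id,
                   PACKETID=0,        # empty or null: read all packages
                   SKIP_TECKEY="X",
                   ENCODING="UTF-8")
print(result["NUMB_OF_PACKETS"], "packages,", result["NUMROWS"], "rows")

# The binary records in RESULTDATA must be decoded according to the field
# layout returned in DATALAYOUT; this step depends on the target structure.
layout = result["DATALAYOUT"]
records = result["RESULTDATA"]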

API: RSB_API_OHS_DEST_READ_DATA
This API reads data from the database table in the BW system.

In Unicode systems, you have to use the same code pages if you use this API.

Recommendation
Try to avoid using API RSB_API_OHS_DEST_READ_DATA if possible. We recommend using the new API
RSB_API_OHS_DEST_READ_DATA_RAW instead, as this can be used with different code pages.

Parameter Type Description

Import OHDEST RSOHDEST Name of the open hub destination

REQUESTID RSBREQUIDOUT Request ID

PACKETID I Data package to be read. If the value is empty or null, all packages are read.

SKIP_TECKEY RS_BOOL Skips technical keys in the target structure (layout structure)

I_ENQUEUE_LOOP_COUNT I Counter for lock loop. A loop waits five seconds before the next loop starts.

Export NUMROWS BAPI6116XX-NUMROWS Number of rows

LINES_PER_RECORD BAPI6116XX-NUMROWS Number of lines per data record

NUMB_OF_PACKETS I Number of packages for this request in the DB table

RETURN BAPIRET2

Tables DATALAYOUT BAPI6118DALO Layout of records

RESULTDATA BAPI6116DA Data records with FOLLOW flag

Field Definition
Use
On the Field Definition tab page you define the properties of the fields that you want to transfer.

Features
We recommend that you use a template when creating the open hub destination. The template should be the object from which you want to update the data. All the fields of the template are then available as fields for the open hub destination. You can edit the field list by removing or adding fields. You can also change the properties of these fields.

You have the following options for adding new fields:

You enter field names and field properties independently of a template.

You select an InfoObject from the Template InfoObject column. The properties of the InfoObject are transferred into the row.

You choose . You are offered a list of the fields in the template for the open hub destination that are not contained in the current field list. You transfer a field to the field list by double-clicking on it. This allows you to transfer fields that had been deleted back into the field list.

If you want to define the properties of a field so that they are different from the properties of the template InfoObject, delete the template InfoObject entry for the relevant field and change the properties of the field. If there is a reference to a template InfoObject, the field properties are always transferred from this InfoObject.

The file or database table that is generated from the open hub destination is made up of the fields and their properties, and not the template InfoObjects of the fields.

If the template for the open hub destination is a DataSource, the field SOURSYSTEM is automatically added to the field list with reference to InfoObject 0SOURSYSTEM. This field is required if data from heterogeneous source systems is being written to the same database table. The data transfer process inserts the source system ID that is relevant for the connected DataSource. You can delete this field if it is not needed.

If you have selected Database Table as the destination and Semantic Key as the property, the field list is given an additional column in which you can define the key fields for the semantic key.

In the Format column, you can specify whether you want to transfer the data in the internal or external format. For example, if you choose External Format here, leading zeros will be removed from a field that has an ALPHA conversion routine when the data is written to the file or database table. Note that the user properties of the user who executes the DTP are used when you write the data in the external format. This has an effect on the decimal and date formats.

Open Hub Service (3.x)


In SAP NetWeaver 7.0, the Open Hub Service was replaced by the Open Hub Destination. You now have to use the new concept.

You can find the documentation for the Open Hub Service at http://help.sap.com/nw70 → Application Help → SAP Library → SAP NetWeaver → SAP NetWeaver by Key Capability → Information Integration → Business Intelligence → Data Warehousing → Data Distribution → Open Hub Service.

Note
In earlier releases (before SAP NetWeaver 7.0), the open hub destination was part of the InfoSpoke. It is now an independent object that provides more options as a result of its integration into the data flow.

Existing InfoSpokes can no longer be used. It is not possible to delete them.

