December 2017
This version of this white paper covers the integration of Supply Chain Planning Cloud Release 17D (or later) with E-Business Suite Release 12.1.3.
Summary
Oracle Supply Chain Planning Cloud provides E-Business Suite customers with capabilities in Supply Planning, Demand Management, Sales and Operations Planning, and Global Order Promising.
The predefined integration between Supply Chain Planning Cloud and E-Business Suite
allows E-Business customers to continue to use their current supply chain fulfillment
processes for operations while leveraging the advanced capabilities in supply planning,
forecasting, simulation and analysis offered by Supply Chain Planning Cloud applications.
The tight integration provides an easy way to bring data from the E-Business Suite supply chain execution modules into Supply Chain Planning Cloud, and the ability to send plan recommendations back to the E-Business Suite modules. Together, these integrations deliver rapid deployment capabilities and help customers avoid building custom integrations that require processing and transformation of data.
Functional Scope and Process Flow
Supported Configurations
The integration supports not only the configuration of one EBS system with one SCP Cloud system as described above, but also multiple EBS systems with multiple SCP Cloud systems, to support advanced deployments such as multiple business units running different EBS systems.
Multiple EBS systems can be integrated with a single SCP Cloud system to enable a central planning system. However, in such cases, each plan in SCP Cloud still plans a single source system only; there is currently no support for planning material flows across source systems. Also, to support test and production SCP Cloud pods, one or more EBS systems can be integrated with multiple SCP Cloud systems at the same time, as shown below.
Not in current scope
Before following these steps, the integration must be configured manually as described in the section “Other Manual Steps to configure the integration” later in this document. Ensure those configuration steps are complete before starting these process steps.
1. The user needs the 'Advanced Planning Administrator' or 'Advanced Supply Chain
Planner' responsibility.
Using either of these responsibilities, submit the concurrent program
'Extract data for Oracle Supply Chain Planning Cloud'.
Specify values for the mandatory parameters and submit the job as shown here:
7. If “Collect All Order Types” is set to No in the Parameters window, then specify the required order types to be collected in the “Include Order Types” field. Otherwise, that field is not required.
8. Transfer the zip file downloaded in Step #2 to the UCM server on Supply Chain Planning Cloud.
Log in to the Supply Chain Planning Cloud instance. From the Navigator, open the 'File Import and Export' UI. Go to Actions > Upload > Browse, select the zip file from your local file system, and upload it using the planning import account, as shown here:
9. Instance setup on SCP Cloud: create a new instance of type "External". The instance code must be the same as the one used on the EBS instance to extract the data files.
10. Load extracted data into Supply Chain Planning cloud. Navigate to any Supply Chain
Planning Work Area (Planning Central, Supply Planning, Demand Management, Sales
and Operations Planning or Global Order Promising) and launch 'Load Planning Data
from Flat Files' to collect the data extracted from EBS source into SCP Cloud repository.
Make sure to use the same "Collection Type" (Net Change or Targeted) when loading the data as was used when extracting it from EBS.
11. Sales Representatives and the Sales Organization hierarchy must be loaded manually into SCP Cloud using the Custom Hierarchy import template in order for the history measures to load successfully. Use the Custom Hierarchy .xlsm template to load this data, following steps similar to Step 10 above.
Alternatively, customers who do not need the Sales Organization hierarchy can edit the shipment and booking history .csv files before loading them, removing the values from the SOR_LVL_ID and SOR_LVL_MEMBER_ID columns (that is, retain the columns but remove only the values).
12. For history collections, the date filter works for Net Change Refresh only; it is not supported for Targeted Refresh collections.
13. Currently, the “Bookings History: Requested Item by Requested Date” and “Shipments History: Requested Item by Requested Date” measures are not loaded into SCP Cloud. It is therefore recommended to set these parameters to “No” so that these measures are not extracted to .csv files, which helps reduce the collections runtime.
14. The Collect Internal Sales Order History feature is not supported in this release, so leave this parameter set to “No” (the default value).
Process Steps for Outbound Integration from SCP Cloud
1. In Supply Plan or Planning Central, run the plan for the organizations collected from the EBS source system.
2. Release the supply/demand recommendations from the Supply Plan or the Demand and Supply Plan. The Release Plan action generates data files in .csv format for the plan recommendations whenever the plan covers organizations from an external source system.
These .csv files are then available as attachments to the "Release planning recommendations: Release to External Source Systems" ESS job. Alternatively, the user can download the zip file from the 'File Import and Export' UI by selecting the account value scm/planningDataLoader/export.
Download the SCP plan recommendations file "ReleaseToExternal.zip" to your local desktop or file system.
3. To import the data into the EBS system, users need the 'Advanced Planning Administrator' or 'Advanced Supply Chain Planner' responsibility. Using one of these responsibilities, navigate to Collections > Legacy Systems > Collect Flat File Data and upload the "ReleaseToExternal.zip" file downloaded in Step 2 above.
This self-service upload submits the concurrent program 'Import Supply Chain Planning Cloud recommendations', which releases the recommendations from Supply Chain Planning Cloud directly to the EBS Purchasing, Manufacturing, and Order Management modules.
Other Manual Steps to configure the integration
1. Before the above process steps can be performed, a new instance must be created manually in the EBS environment. This instance is used for extracting the data files. In the current release of the integration, the instance setup must be done manually by running the environment configuration script provided in Appendix 1.
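For reference, the script in Appendix 1 connects using the username and password passed as its first two positional arguments (the CONNECT &&1/&&2 line) and then prompts for the instance code. A hedged invocation sketch from SQL*Plus follows; the credentials shown are placeholders, not real values:

```sql
-- Placeholder credentials -- substitute your own APPS account.
-- After connecting, the script prompts for &enter_instance_code.
@Configuration_Script.sql apps apps_password
```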
The integration can be customized to modify the data being extracted at various stages. A new custom package, MSC_APCL_CUSTOM, holds all the procedures in which custom code can be embedded.
1. Custom hooks for data extraction
MSC_APCL_CUSTOM.FRAMEWORK_PRE_PROCESS
This procedure gets called at the beginning of the data extraction process. Below are the input/output parameters of this procedure.
procedure framework_pre_process(
    p_errbuf             out nocopy varchar2,  -- returns an error message, if any
    p_retcode            out nocopy varchar2,  -- returns status: 0 (success), 1 (error), 2 (warning)
    p_custom_catlog_impl out nocopy number,    -- returns 1 (yes) if a custom catalog is implemented, else 2 (no)
    p_request_id         in  number)           -- request id of the concurrent job 'Extract data for Oracle Supply Chain Planning Cloud'
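As an illustration, a minimal package-body implementation of this hook might look like the following sketch. The signature is taken from the listing above; the fnd_file logging call is a common EBS idiom and an assumption here, as is the choice to return 2 (no custom catalog):

```sql
-- Hypothetical sketch of a MSC_APCL_CUSTOM package-body implementation
-- of framework_pre_process. Adapt the logging to your own convention.
procedure framework_pre_process(
    p_errbuf             out nocopy varchar2,
    p_retcode            out nocopy varchar2,
    p_custom_catlog_impl out nocopy number,
    p_request_id         in  number)
is
begin
    -- Record that the extract has started for this concurrent request.
    fnd_file.put_line(fnd_file.LOG,
        'framework_pre_process invoked for request ' || p_request_id);

    -- No custom catalog in this sketch: return 2 (no).
    p_custom_catlog_impl := 2;
    p_retcode := '0';       -- success
exception
    when others then
        p_errbuf  := sqlerrm;
        p_retcode := '1';   -- error
end framework_pre_process;
```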
MSC_APCL_CUSTOM.ENTITY_PRE_PROCESS
This procedure is invoked at the beginning of data extraction for the entity.
procedure entity_pre_process(
    p_errbuf             out nocopy varchar2,      -- returns an error message, if any
    p_retcode            out nocopy varchar2,      -- returns status
    p_instance_code      in varchar2,
    p_entity_name        in varchar2,
    p_coll_mode          in varchar2,
    p_org_filter         in varchar2,
    p_catalog_filter     in varchar2,
    p_prd                in varchar2,              -- date string in 'yyyy/mm/dd hh24:mi:ss' format
    p_prn                in varchar2,
    p_lrn                in varchar2,
    p_date_from          in varchar2 default null, -- date string in 'yyyy/mm/dd' format
    p_date_to            in varchar2 default null, -- date string in 'yyyy/mm/dd' format
    p_coll_iso           in number default 2,
    p_coll_all_ord_types in number default 2,
    p_incl_ord_types     in varchar2 default null)
MSC_APCL_CUSTOM.ENTITY_POST_PROCESS
This procedure is invoked at the end of data extraction for the entity.
procedure entity_post_process(
    p_errbuf             out nocopy varchar2,
    p_retcode            out nocopy varchar2,
    p_instance_code      in varchar2,
    p_entity_name        in varchar2,
    p_coll_mode          in varchar2,
    p_org_filter         in varchar2,
    p_catalog_filter     in varchar2,
    p_prd                in varchar2,              -- date string in 'yyyy/mm/dd hh24:mi:ss' format
    p_prn                in varchar2,
    p_lrn                in varchar2,
    p_date_from          in varchar2 default null, -- date string in 'yyyy/mm/dd' format
    p_date_to            in varchar2 default null, -- date string in 'yyyy/mm/dd' format
    p_coll_iso           in number default 2,
    p_coll_all_ord_types in number default 2,
    p_incl_ord_types     in varchar2 default null)
MSC_APCL_CUSTOM.EXTEND_FSCP_RELEASE_MSC_DATA
MSC_APCL_CUSTOM.EXTEND_FSCP_RELEASE_ERP_DATA
This custom hook can be used to modify records in the PO and WIP interface tables for the current batch being released to the ERP source system.
procedure extend_fscp_release_erp_data(
    p_errbuf       out nocopy varchar2,
    p_retcode      out nocopy varchar2,
    p_po_batch_id  in number,   -- batch_id of the records loaded into the PO interface table
    p_wip_batch_id in number)   -- batch_id of the records loaded into the WIP interface table
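A minimal sketch of how this hook could be implemented is shown below. The table name PO_REQUISITIONS_INTERFACE_ALL and its BATCH_ID and LINE_ATTRIBUTE1 columns are assumptions based on standard EBS interface tables; verify the interface tables and columns used by your EBS release before adapting this:

```sql
-- Hypothetical sketch: stamp a descriptive-flexfield note on the
-- requisition interface rows created for this release batch.
procedure extend_fscp_release_erp_data(
    p_errbuf       out nocopy varchar2,
    p_retcode      out nocopy varchar2,
    p_po_batch_id  in number,
    p_wip_batch_id in number)
is
begin
    -- Tag only the rows belonging to the current release batch.
    update po_requisitions_interface_all
       set line_attribute1 = 'Released from SCP Cloud'
     where batch_id = p_po_batch_id;

    p_retcode := '0';       -- success
exception
    when others then
        p_errbuf  := sqlerrm;
        p_retcode := '1';   -- error
end extend_fscp_release_erp_data;
```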
Appendix 1 - Configuration_Script.sql
REM +=========================================================================+
REM | Copyright (c) 2013 Oracle Corporation
REM | Redwood Shores, California, USA
REM | All rights reserved.
REM +=========================================================================+
REM | Name
REM | Configuration_Script.sql, version 1
REM |
REM | Issue Detail:
REM | Using this script, we will create the instance of EXTERNAL type.
REM | -> This will be used to extract the data from EBS Instance and
REM | can be used to load data in fusion cloud
REM | -> There is no UI developed in EBS that will create instance of
REM | type EXTERNAL.
REM | Hence, we are providing data script to create the source system.
REM |
REM | Script Type:
REM | Insert
REM |
REM | List of steps/verification before datafix:
REM |
REM |
REM | List of steps/verification after datafix:
REM |
REM | Check the instance record got inserted to MSC_APPS_INSTANCES and
REM | MRP_AP_APPS_INSTANCES_ALL. Check the Orgs are assigned to this
REM | created instance in MSC_INSTANCE_ORGS.
REM |
REM | Version History:
REM | 1. created to insert instance and assign orgs.
REM |
REM +=========================================================================+
CONNECT &&1/&&2;
SET ECHO ON
SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET VERIFY OFF;
WHENEVER SQLERROR EXIT FAILURE ROLLBACK;
WHENEVER OSERROR EXIT FAILURE ROLLBACK;
DECLARE
p_input_instance varchar2(3);
p_input_validation_org varchar2(3);
p_input_group_code varchar2(30);
p_input_org_code_list varchar2(100);
l_val_org_exists number;
l_org_count number;
l_org_array dbms_utility.lname_array;
l_input_validation_id number;
l_org_id_array dbms_utility.lname_array;
l_instance_id number;
BEGIN
--DBMS_OUTPUT.PUT_LINE('Welcome to EBS-Fusion Cloud Integration...');
--DBMS_OUTPUT.PUT_LINE('Enter the Instance Code :');
p_input_instance := upper('&enter_instance_code');
DBMS_OUTPUT.PUT_LINE('================================');
DBMS_OUTPUT.PUT_LINE('Input Instance Code : ' || p_input_instance);
DBMS_OUTPUT.PUT_LINE('Input Validation Organization Code : ' || p_input_validation_org);
DBMS_OUTPUT.PUT_LINE('Input Organization Group Code : ' || p_input_group_code);
DBMS_OUTPUT.PUT_LINE('Input Organization Codes : ' || p_input_org_code_list);
DBMS_OUTPUT.PUT_LINE('Merging to MSC_APPS_INSTANCES...');
MERGE into MSC_APPS_INSTANCES msc using (
select p_input_instance instance_code,
l_input_validation_id validation_org_id,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
on (msc.instance_code = rec.instance_code)
WHEN MATCHED THEN
UPDATE SET
msc.validation_org_id = rec.validation_org_id,
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (instance_code,
apps_ver,
instance_type,
dbs_ver,
enable_flag,
apps_lrn,
instance_id,
st_status,
cleansed_flag,
gmt_difference,
last_update_date,
last_updated_by,
creation_date,
created_by,
currency,
allow_atp_flag,
allow_release_flag,
validation_org_id)
VALUES(
rec.instance_code,
-1, --APPS_VER,
3, --INSTANCE_TYPE,
0, --DBS_VER,
1, --ENABLE_FLAG,
0, --APPS_LRN,
l_instance_id,
0, --ST_STATUS,
1, --CLEANSED_FLAG,
0, --GMT_DIFFERENCE,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by,
'USD', --CURRENCY,
2, --ALLOW_ATP_FLAG,
2, --ALLOW_RELEASE_FLAG,
rec.validation_org_id
);
DBMS_OUTPUT.PUT_LINE('Merging to MRP_AP_APPS_INSTANCES_ALL...');
MERGE into MRP_AP_APPS_INSTANCES_ALL msc using (
select p_input_instance instance_code,
l_instance_id instance_id,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
on (msc.instance_code = rec.instance_code)
WHEN MATCHED THEN
UPDATE SET
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (
instance_id,
instance_code,
sn_status,
allow_atp_flag,
allow_release_flag,
last_update_date,
last_updated_by,
creation_date,
created_by
)
VALUES(
rec.instance_id,
rec.instance_code,
-1, --SN_STATUS,
2, --ALLOW_ATP_FLAG,
2, --ALLOW_RELEASE_FLAG,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by
);
DBMS_OUTPUT.PUT_LINE('Merging to MSC_INSTANCE_ORGS...');
FOR i in 1 .. l_org_count
LOOP
MERGE into MSC_INSTANCE_ORGS msc using (
select p_input_group_code org_group,
l_instance_id instance_id,
l_org_id_array(i) organization_id,
1 enabled_flag,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
ON (msc.sr_instance_id = rec.instance_id and
msc.organization_id=rec.organization_id)
WHEN MATCHED THEN
UPDATE SET
msc.org_group = rec.org_group,
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (
sr_instance_id,
organization_id,
org_group,
enabled_flag,
last_update_date,
last_updated_by,
creation_date,
created_by
)
VALUES(
rec.instance_id,
rec.organization_id,
rec.org_group,
rec.enabled_flag,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by
);
END LOOP;
DBMS_OUTPUT.PUT_LINE('End. Please issue COMMIT...');
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('Error while executing Configuration_Script.sql : ' || SQLERRM);
DBMS_OUTPUT.PUT_LINE('Error while executing Configuration_Script.sql : ' || DBMS_UTILITY.FORMAT_ERROR_STACK);
DBMS_OUTPUT.PUT_LINE('Error while executing Configuration_Script.sql : ' || DBMS_UTILITY.FORMAT_ERROR_BACKTRACE);
END;
/
SPOOL OFF;