
An Oracle White Paper

December 2017

Oracle Supply Chain Planning Cloud


for E-Business Suite Customers
Contents
Overview
E-Business Suite Requirements
Summary
Functional Scope and Process Flow
Process Steps for Inbound Integration to SCP Cloud
Process Steps for Outbound Integration from SCP Cloud
Other Manual Steps to configure the integration
Optional Steps to Customize the Integration Logic
Appendix 1 - Configuration_Script.sql
Appendix 2 – Table MSC_EBS_FUSION_COLL_INTG
Overview

This white paper covers the integration of Supply Chain Planning Cloud
Release 17D (or later) with E-Business Suite Release 12.1.3.

E-Business Suite Requirements

&#xF0B7; Patch 27082512:R12.MSC.B, released as generally available

Summary

Oracle Supply Chain Planning Cloud provides E-Business Suite customers with capabilities
in Supply Planning, Demand Management, Sales and Operations Planning, and Global Order
Promising.

The predefined integration between Supply Chain Planning Cloud and E-Business Suite
allows E-Business customers to continue to use their current supply chain fulfillment
processes for operations while leveraging the advanced capabilities in supply planning,
forecasting, simulation and analysis offered by Supply Chain Planning Cloud applications.

The tight integration provides an easy way to bring data from the E-Business Suite supply
chain execution modules to Supply Chain Planning Cloud, and the ability to send plan
recommendations back to the E-Business Suite modules. Together, these integrations deliver
rapid deployment capabilities and help customers avoid building custom
integrations that require processing and transformation of data.
Functional Scope and Process Flow

Supported Configurations

The integration supports not only a single EBS system integrated with a single SCP Cloud
system, but also multiple EBS systems integrated with multiple SCP Cloud systems, to
support advanced deployments such as multiple business units running on different
EBS systems.
Multiple EBS systems can be integrated with a single SCP Cloud system to enable a central
planning system. However, in such cases, each plan in SCP Cloud still plans a single
source system only; there is currently no support for planning material flows across source
systems. Also, to support test and production SCP Cloud pods, one or more EBS systems
can be integrated with multiple SCP Cloud systems at the same time.
Not in current scope

This integration currently excludes Process Manufacturing. Hence, any inventory
organizations set up as Process Manufacturing organizations in the EBS system will not be
extracted.

Process Steps for Inbound Integration to SCP Cloud

Before following these steps, the integration must be configured manually as described
in the section "Other Manual Steps to configure the integration" later in this document.
Ensure those steps are complete before starting this process.
1. The user needs the 'Advanced Planning Administrator' or 'Advanced Supply Chain
Planner' responsibility.
Using either of these responsibilities, submit the concurrent program
'Extract data for Oracle Supply Chain Planning Cloud'.
Specify values for the mandatory parameters and submit the job.

2. Locate the extracted zip file on the EBS middle tier.
File location (directory on middle tier): $MSC_TOP/out/EbsToSCPCloudExtract_<<conc_job_request_id>>/
Filename: EbsToSCPCloudExtract_<<conc_job_request_id>>.zip
For example, if the concurrent job 'Extract data for Oracle Supply Chain Planning
Cloud' is submitted with request ID 40881431, then:
File location = $MSC_TOP/out/EbsToSCPCloudExtract_40881431/
Filename = EbsToSCPCloudExtract_40881431.zip
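When scripting the download, the extract location can be derived from the request ID alone. A minimal Python sketch (not part of the patch; it simply reproduces the path layout shown in the example above):

```python
# Sketch: derive the extract zip path on the EBS middle tier from a
# concurrent request ID, following the layout documented above.
import os

def extract_zip_path(request_id: int, msc_top: str = "$MSC_TOP") -> str:
    """Return the full path of the extract zip for a given request ID."""
    dirname = f"EbsToSCPCloudExtract_{request_id}"
    return os.path.join(msc_top, "out", dirname, f"{dirname}.zip")

print(extract_zip_path(40881431))
# $MSC_TOP/out/EbsToSCPCloudExtract_40881431/EbsToSCPCloudExtract_40881431.zip
```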
4. Download or FTP the zip file to your desktop or local system.
5. Fiscal Calendar is supported in Targeted Refresh mode only. Select Yes or No for
Fiscal Calendar in the Concurrent Program Parameters window.
6. History Date Range Type must be set to Absolute or Rolling. This is mandatory
for Net Change Refresh.
7. If "Collect All Order Types" is set to No in the Parameters window, then the "Include
Order Types" field must be populated with the order types to be
collected; otherwise it is not required.
8. Transfer the zip file, downloaded in Step 4 above, to the UCM server on Supply Chain
Planning Cloud.
Log in to the Supply Chain Planning Cloud instance. From the Navigator, open the 'File
Import and Export' UI. Go to Actions > Upload, browse your local file system, and
select the zip file to upload with the planning import account.
9. Instance setup on SCP Cloud: create a new instance of type "External". The instance code
must be the same as the one used on the EBS instance to extract the data files.
10. Load the extracted data into Supply Chain Planning Cloud. Navigate to any Supply Chain
Planning work area (Planning Central, Supply Planning, Demand Management, Sales
and Operations Planning, or Global Order Promising) and launch 'Load Planning Data
from Flat Files' to collect the data extracted from the EBS source into the SCP Cloud repository.

Make sure to use the same "Collection Type" (Net Change or Targeted) when loading
the data as was used when extracting it from EBS.

11. Sales representatives and the sales organization hierarchy must be loaded manually
into SCP Cloud using the Custom Hierarchy import template in order for the history
measures to load successfully. Use the Custom Hierarchy .xlsm template to load this
data, following steps similar to Step 10 above.
Alternatively, customers who do not need the sales organization hierarchy can edit the
shipment and booking history .csv files by removing the values from the
SOR_LVL_ID and SOR_LVL_MEMBER_ID columns (that is, retain the columns but remove only
the values) before loading them.
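The column-blanking edit in step 11 can also be scripted. A minimal Python sketch (not part of the shipped integration; the SOR_LVL_ID and SOR_LVL_MEMBER_ID column names come from the document, while the other columns in the sample are hypothetical):

```python
# Sketch: blank the SOR_LVL_ID and SOR_LVL_MEMBER_ID values in a
# history .csv while keeping the columns themselves, as step 11 requires.
import csv
import io

def blank_sales_org_columns(csv_text: str) -> str:
    """Return csv_text with SOR_LVL_ID / SOR_LVL_MEMBER_ID values emptied."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames,
                            lineterminator="\n")
    writer.writeheader()
    for row in reader:
        for col in ("SOR_LVL_ID", "SOR_LVL_MEMBER_ID"):
            if col in row:
                row[col] = ""  # keep the column, drop only the value
        writer.writerow(row)
    return out.getvalue()

# Hypothetical sample row, for illustration only.
sample = "ITEM,SOR_LVL_ID,SOR_LVL_MEMBER_ID,QTY\nA100,10,200,5\n"
print(blank_sales_org_columns(sample))
# ITEM,SOR_LVL_ID,SOR_LVL_MEMBER_ID,QTY
# A100,,,5
```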

12. For history collections, the date filter works for Net Change Refresh only; it is not
supported for Targeted Refresh collections.
13. Currently, the "Bookings History: Requested Item by Requested Date" and "Shipments
History: Requested Item by Requested Date" measures are not loaded into SCP
Cloud. It is therefore recommended to set these parameters to "No" so that these
measures are not extracted into .csv files, which helps reduce collections runtime.

14. The Collect Internal Sales Order History feature is not supported in this release, so leave
this parameter at "No" (the default value).
Process Steps for Outbound Integration from SCP Cloud

1. In Supply Planning or Planning Central, run the plan for the organizations collected from
the EBS source system.
2. Release the supply/demand recommendations from the supply plan or the demand and
supply plan. The release plan action generates data files in .csv format for the plan
recommendations whenever the plan is for organizations from an external source system.
These .csv files are available as attachments to the "Release planning
recommendations: Release to External Source Systems" ESS job. Alternatively, the user can
download the zip file from the 'File Import and Export' UI by selecting the account value
scm/planningDataLoader/export.

Download the SCP plan recommendations file "ReleaseToExternal.zip" to your local
desktop or file system.
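Before uploading, it can be useful to check which .csv files the downloaded zip actually contains. A hedged Python sketch (not part of the integration; the member names below are hypothetical stand-ins, and only the "ReleaseToExternal.zip" filename comes from the document):

```python
# Sketch: list the .csv members of the downloaded recommendations zip.
# The zip built here stands in for a real download; member names are
# hypothetical.
import zipfile

def list_recommendation_files(zip_path: str):
    """Return the sorted .csv member names inside a release zip."""
    with zipfile.ZipFile(zip_path) as zf:
        return sorted(n for n in zf.namelist() if n.endswith(".csv"))

# Build a stand-in zip to demonstrate the call.
with zipfile.ZipFile("ReleaseToExternal.zip", "w") as zf:
    zf.writestr("PlannedOrderSupply.csv", "header\n")
    zf.writestr("readme.txt", "not a csv\n")

print(list_recommendation_files("ReleaseToExternal.zip"))
# ['PlannedOrderSupply.csv']
```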
3. To import the data into the EBS system, users need the 'Advanced Planning
Administrator' or 'Advanced Supply Chain Planner' responsibility. Using one of these
responsibilities, navigate to Collections > Legacy Systems > Collect Flat File Data
and upload the "ReleaseToExternal.zip" file downloaded in Step 2 above.

This self-service upload submits the concurrent program 'Import Supply Chain Planning
Cloud recommendations', which releases the recommendations from Supply Chain Planning
Cloud directly to the EBS Purchasing, Manufacturing, and Order Management modules.
Other Manual Steps to configure the integration

1. Before the above process steps can be performed, a new instance must be created
manually in the EBS environment. This instance is used for extracting the data files. In
the current release of the integration, the instance setup is done manually
by running the environment configuration script provided in Appendix 1.

Run the script as the "apps" user as shown below.

&#xF0B7; Validation organization code is an optional input to the script.
&#xF0B7; The script can be run multiple times to add organizations and org groups
incrementally to the instance.

Here is an example input and output of the script:

SQL> @Configuration_Script.sql apps ****


------------------------------------------------------
Enter value for enter_instance_code: EB1
Enter value for enter_validation_org_code: V1
Enter value for enter_org_group_code: OPS
Enter value for enter_list_of_org_code: D1,M5
================================
Input Instance Code : EB1
Input Validation Organization Code : V1
Input Organization Group Code : OPS
Input Organization Codes : D1,M5
Validating the provided values...
Validating the Instance code...
New instance...
Validating the Validation Organization Code...
Deriving the id of Validation Organization Code..
Validating the provided 2 Organization Code(s)...
Deriving the id of Organization Code : D1..
Deriving the id of Organization Code : M5..
Merging to MSC_APPS_INSTANCES...
Merging to MRP_AP_APPS_INSTANCES_ALL...
Merging to MSC_INSTANCE_ORGS...
End. Please issue COMMIT...

PL/SQL procedure successfully completed.

Optional Steps to Customize the Integration Logic

The integration can be customized to modify the data being extracted at various stages.
A new custom package, MSC_APCL_CUSTOM, holds all the procedures in which custom
code can be embedded.
1. Custom hooks for data extraction

Customization applicable for all the entities being extracted:

Place this type of customization code inside the procedure called

MSC_APCL_CUSTOM.FRAMEWORK_PRE_PROCESS

This procedure gets called at the beginning of the data extraction process. Below
are the input/output parameters of this procedure.

procedure framework_pre_process(
  p_errbuf             out nocopy varchar2, -- return error message, if any
  p_retcode            out nocopy varchar2, -- return status: 0 (success), 1 (error), 2 (warning)
  p_custom_catlog_impl out nocopy number,   -- return 1 (yes) if a custom catalog is implemented, else 2 (no)
  p_request_id         in number)           -- request ID of the concurrent job 'Extract data for Oracle Supply Chain Planning Cloud'

Entity specific customization:

Place the custom code inside these two procedures below:

MSC_APCL_CUSTOM.ENTITY_PRE_PROCESS

This procedure is invoked at the beginning of data extraction for the entity.

procedure entity_pre_process(
  p_errbuf             out nocopy varchar2,        -- return error message, if any
  p_retcode            out nocopy varchar2,        -- return status
  p_instance_code      in varchar2,
  p_entity_name        in varchar2,
  p_coll_mode          in varchar2,
  p_org_filter         in varchar2,
  p_catalog_filter     in varchar2,
  p_prd                in varchar2,                -- date string in 'yyyy/mm/dd hh24:mi:ss' format
  p_prn                in varchar2,
  p_lrn                in varchar2,
  p_date_from          in varchar2 default null,   -- date string in 'yyyy/mm/dd' format
  p_date_to            in varchar2 default null,   -- date string in 'yyyy/mm/dd' format
  p_coll_iso           in number default 2,
  p_coll_all_ord_types in number default 2,
  p_incl_ord_types     in varchar2 default null)

MSC_APCL_CUSTOM.ENTITY_POST_PROCESS

This procedure is invoked at the end of data extraction for the entity.

procedure entity_post_process(
  p_errbuf             out nocopy varchar2,        -- return error message, if any
  p_retcode            out nocopy varchar2,        -- return status
  p_instance_code      in varchar2,
  p_entity_name        in varchar2,
  p_coll_mode          in varchar2,
  p_org_filter         in varchar2,
  p_catalog_filter     in varchar2,
  p_prd                in varchar2,                -- date string in 'yyyy/mm/dd hh24:mi:ss' format
  p_prn                in varchar2,
  p_lrn                in varchar2,
  p_date_from          in varchar2 default null,   -- date string in 'yyyy/mm/dd' format
  p_date_to            in varchar2 default null,   -- date string in 'yyyy/mm/dd' format
  p_coll_iso           in number default 2,
  p_coll_all_ord_types in number default 2,
  p_incl_ord_types     in varchar2 default null)

2. To improve the performance of the extract, users can set custom_filter,
custom_extract_group, and custom_fetch_size for each entity being extracted. To do
this, update the table MSC_EBS_FUSION_COLL_INTG to set these three columns
for each entity.
&#xF0B7; custom_filter : restricts the data read from the EBS source view for the
given entity.
&#xF0B7; custom_fetch_size : the number of records fetched before being flushed
to the .csv file generated for the given entity. This can be set to a
higher value for entities that produce a large number of records.
&#xF0B7; custom_extract_group : data extraction happens in groups. All the entities are
segregated across 10 groups, based on their size, to distribute the load across 10
workers. If a customer's data distribution for the extracted entities differs,
the default grouping can be overridden using this column.

See Appendix-2 for a complete definition of this table.
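As an illustration of the tuning update in step 2, the statement could be prepared with bind variables. A Python sketch (not part of the patch; the table and column names come from Appendix 2, while the entity name, filter, and sizes below are hypothetical examples):

```python
# Sketch: build a parameterized UPDATE for one entity's extract tuning.
# Table/column names are from Appendix 2; the entity name, filter, and
# sizes used in the demo call are hypothetical.
def build_tuning_update(entity: str, custom_filter: str,
                        extract_group: int, fetch_size: int):
    """Return (sql, binds) for updating MSC_EBS_FUSION_COLL_INTG."""
    sql = ("UPDATE msc_ebs_fusion_coll_intg "
           "SET custom_filter = :flt, "
           "custom_extract_group = :grp, "
           "custom_fetch_size = :fsz "
           "WHERE entity = :ent")
    binds = {"flt": custom_filter, "grp": extract_group,
             "fsz": fetch_size, "ent": entity}
    return sql, binds

sql, binds = build_tuning_update("SUPPLY", "organization_id = 207",
                                 5, 50000)
print(sql)
```

The statement and binds would then be run through whatever database client is in use; the rows take effect on the next extract.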

3. Custom hooks for release

The following procedures enable users to customize the import of SCP Cloud
recommendations into EBS.

MSC_APCL_CUSTOM.EXTEND_FSCP_RELEASE_MSC_DATA

This custom hook can be used to modify records in the
MSC_RELEASE_LINES_INT table for the current batch being released to the ERP
source system.

procedure extend_fscp_release_msc_data(
  p_errbuf   out nocopy varchar2, -- return error message, if any
  p_retcode  out nocopy varchar2, -- return status
  p_batch_id in number)           -- batch_id of the records loaded into MSC_RELEASE_LINES_INT through CSV files in the current release data load

MSC_APCL_CUSTOM.EXTEND_FSCP_RELEASE_ERP_DATA

This custom hook can be used to modify records in the PO and WIP interface
tables for the current batch being released to the ERP source system.

procedure extend_fscp_release_erp_data(
  p_errbuf       out nocopy varchar2, -- return error message, if any
  p_retcode      out nocopy varchar2, -- return status
  p_po_batch_id  in number,           -- batch_id of the records loaded into the PO interface table
  p_wip_batch_id in number)           -- batch_id of the records loaded into the WIP interface table
Appendix 1 - Configuration_Script.sql

REM +=========================================================================+
REM | Copyright (c) 2013 Oracle Corporation
REM | Redwood Shores, California, USA
REM | All rights reserved.
REM +=========================================================================+
REM | Name
REM | Configuration_Script.sql, version 1
REM |
REM | Issue Detail:
REM | Using this script, we will create the instance of EXTERNAL type.
REM | -> This will be used to extract the data from EBS Instance and
REM | can be used to load data in fusion cloud
REM | -> There is no UI developed in EBS that will create instance of
REM | type EXTERNAL.
REM | Hence, we are providing data script to create the source system.
REM |
REM | Script Type:
REM | Insert
REM |
REM | List of steps/verification before datafix:
REM |
REM |
REM | List of steps/verification after datafix:
REM |
REM | Check the instance record got inserted to MSC_APPS_INSTANCES and
REM | MRP_AP_APPS_INSTANCES_ALL. Check the Orgs are assigned to this
REM | created instance in MSC_INSTANCE_ORGS.
REM |
REM | Version History:
REM | 1. created to insert instance and assign orgs.
REM |
REM +=========================================================================+

-- Script Starts From Here --

CONNECT &&1/&&2;

SET SERVEROUTPUT ON;


SPOOL Create_Instance.log

SET ECHO ON
SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET VERIFY OFF;
WHENEVER SQLERROR EXIT FAILURE ROLLBACK;
WHENEVER OSERROR EXIT FAILURE ROLLBACK;

DECLARE
p_input_instance varchar2(3);
p_input_validation_org varchar2(3);
p_input_group_code varchar2(30);
p_input_org_code_list varchar2(100);

l_val_org_exists number;
l_org_count number;
l_org_array dbms_utility.lname_array;
l_input_validation_id number;
l_org_id_array dbms_utility.lname_array;
l_instance_id number;

BEGIN
--DBMS_OUTPUT.PUT_LINE('Welcome to EBS-Fusion Cloud Integration...');
--DBMS_OUTPUT.PUT_LINE('Enter the Instance Code :');
p_input_instance := upper('&enter_instance_code');

--DBMS_OUTPUT.PUT_LINE('Enter the Validation Organization Code :');
p_input_validation_org := upper('&enter_validation_org_code');

--DBMS_OUTPUT.PUT_LINE('Enter the Organization Group Code :');
p_input_group_code := upper('&enter_org_group_code');

--DBMS_OUTPUT.PUT_LINE('Enter the Organization Codes (comma separated. Eg: M1,M2) :');
p_input_org_code_list := upper('&enter_list_of_org_code');

DBMS_OUTPUT.PUT_LINE('================================');
DBMS_OUTPUT.PUT_LINE('Input Instance Code : '||p_input_instance);
DBMS_OUTPUT.PUT_LINE('Input Validation Organization Code : '||p_input_validation_org);
DBMS_OUTPUT.PUT_LINE('Input Organization Group Code : '||p_input_group_code);
DBMS_OUTPUT.PUT_LINE('Input Organization Codes : '||p_input_org_code_list);

DBMS_OUTPUT.PUT_LINE('Validating the provided values...');

DBMS_OUTPUT.PUT_LINE('Validating the Instance code...');


select count(1) into l_val_org_exists from msc_apps_instances
where instance_code=p_input_instance and rownum=1;
if l_val_org_exists <> 1 then
DBMS_OUTPUT.PUT_LINE('New instance...');
l_instance_id := MSC_APPS_INSTANCES_S.nextval;
else
DBMS_OUTPUT.PUT_LINE('Instance already exists');
select instance_id into l_instance_id from msc_apps_instances where
instance_code=p_input_instance and rownum=1;
end if;

DBMS_OUTPUT.PUT_LINE('Validating the Validation Organization Code...');


if ( p_input_validation_org is not null)
then
select count(1) into l_val_org_exists from mtl_parameters where
organization_code=p_input_validation_org and rownum=1;
if l_val_org_exists <> 1 then
DBMS_OUTPUT.PUT_LINE('Invalid Validation Organization Code..');
DBMS_OUTPUT.PUT_LINE('Exiting..');
return;
else
DBMS_OUTPUT.PUT_LINE('Deriving the id of Validation Organization Code..');
select ORGANIZATION_ID into l_input_validation_id from
mtl_parameters where organization_code=p_input_validation_org and rownum=1;
end if;
else
DBMS_OUTPUT.PUT_LINE('Input Validation Organization Code is null..');
l_input_validation_id := null;
end if;
dbms_utility.comma_to_table
( list => p_input_org_code_list
, tablen => l_org_count
, tab => l_org_array
);

DBMS_OUTPUT.PUT_LINE('Validating the provided '||l_org_count||' Organization Code(s)...');
for i in 1 .. l_org_count
loop
select count(1) into l_val_org_exists from mtl_parameters where
organization_code=l_org_array(i) and rownum=1;
if l_val_org_exists <> 1 then
DBMS_OUTPUT.PUT_LINE('Invalid Organization Code : '||l_org_array(i));
DBMS_OUTPUT.PUT_LINE('Exiting..');
return;
else
DBMS_OUTPUT.PUT_LINE('Deriving the id of Organization Code : '||l_org_array(i)||'..');
select ORGANIZATION_ID into l_org_id_array(i) from mtl_parameters
where organization_code=l_org_array(i) and rownum=1;
end if;
end loop;

DBMS_OUTPUT.PUT_LINE('Merging to MSC_APPS_INSTANCES...');
MERGE into MSC_APPS_INSTANCES msc using (
select p_input_instance instance_code,
l_input_validation_id validation_org_id,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
on (msc.instance_code = rec.instance_code)
WHEN MATCHED THEN
UPDATE SET
msc.validation_org_id = rec.validation_org_id,
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (instance_code,
apps_ver,
instance_type,
dbs_ver,
enable_flag,
apps_lrn,
instance_id,
st_status,
cleansed_flag,
gmt_difference,
last_update_date,
last_updated_by,
creation_date,
created_by,
currency,
allow_atp_flag,
allow_release_flag,
validation_org_id)
VALUES(
rec.instance_code,
-1, --APPS_VER,
3, --INSTANCE_TYPE,
0, --DBS_VER,
1, --ENABLE_FLAG,
0, --APPS_LRN,
l_instance_id,
0, --ST_STATUS,
1, --CLEANSED_FLAG,
0, --GMT_DIFFERENCE,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by,
'USD', --CURRENCY,
2, --ALLOW_ATP_FLAG,
2, --ALLOW_RELEASE_FLAG,
rec.validation_org_id
);

DBMS_OUTPUT.PUT_LINE('Merging to MRP_AP_APPS_INSTANCES_ALL...');
MERGE into MRP_AP_APPS_INSTANCES_ALL msc using (
select p_input_instance instance_code,
l_instance_id instance_id,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
on (msc.instance_code = rec.instance_code)
WHEN MATCHED THEN
UPDATE SET
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (
instance_id,
instance_code,
sn_status,
allow_atp_flag,
allow_release_flag,
last_update_date,
last_updated_by,
creation_date,
created_by
)
VALUES(
rec.instance_id,
rec.instance_code,
-1, --SN_STATUS,
2, --ALLOW_ATP_FLAG,
2, --ALLOW_RELEASE_FLAG,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by
);

DBMS_OUTPUT.PUT_LINE('Merging to MSC_INSTANCE_ORGS...');
FOR i in 1 .. l_org_count
LOOP
MERGE into MSC_INSTANCE_ORGS msc using (
select p_input_group_code org_group,
l_instance_id instance_id,
l_org_id_array(i) organization_id,
1 enabled_flag,
sysdate creation_date,
sysdate last_update_date,
-1 created_by, -- get the session user
-1 last_updated_by -- get the session user
from dual) rec
ON (msc.sr_instance_id = rec.instance_id and
msc.organization_id=rec.organization_id)
WHEN MATCHED THEN
UPDATE SET
msc.org_group = rec.org_group,
msc.last_update_date = rec.last_update_date,
msc.last_updated_by = rec.last_updated_by
WHEN NOT MATCHED THEN
INSERT (
sr_instance_id,
organization_id,
org_group,
enabled_flag,
last_update_date,
last_updated_by,
creation_date,
created_by
)
VALUES(
rec.instance_id,
rec.organization_id,
rec.org_group,
rec.enabled_flag,
rec.last_update_date,
rec.last_updated_by,
rec.creation_date,
rec.created_by
);
END LOOP;
DBMS_OUTPUT.PUT_LINE('End. Please issue COMMIT...');
EXCEPTION
WHEN OTHERS THEN
DBMS_OUTPUT.PUT_LINE('Error while executing Create_Instance.sql : '||SQLERRM);
DBMS_OUTPUT.PUT_LINE('Error while executing Create_Instance.sql : '||DBMS_UTILITY.FORMAT_ERROR_STACK);
DBMS_OUTPUT.PUT_LINE('Error while executing Create_Instance.sql : '||DBMS_UTILITY.FORMAT_ERROR_BACKTRACE);
END;
/
SPOOL OFF;

Appendix 2 – Table MSC_EBS_FUSION_COLL_INTG


BUSINESS_OBJECT VARCHAR2(60) NOT NULL ENABLE,
ENTITY VARCHAR2(32) NOT NULL ENABLE,
EXTRACT_GROUP NUMBER NOT NULL ENABLE,
FILTER_TYPE NUMBER NOT NULL ENABLE,
TGT_DATA_SRC VARCHAR2(30) NOT NULL ENABLE,
NC_MERGE_DATA_SRC VARCHAR2(30),
NC_DELETE_DATA_SRC VARCHAR2(30),
NUMBER_OF_RNS NUMBER,
NUMBER_OF_RDS NUMBER,
ENTITY_SNAPSHOT_EBS VARCHAR2(32),
FILE_NAME VARCHAR2(50) NOT NULL ENABLE,
INST_CODE_POS NUMBER NOT NULL ENABLE,
CUSTOM_FILTER VARCHAR2(4000),
CUSTOM_EXTRACT_GROUP NUMBER,
CUSTOM_FETCH_SIZE NUMBER,
CREATED_BY VARCHAR2(64) NOT NULL ENABLE,
CREATION_DATE TIMESTAMP(6) NOT NULL ENABLE,
LAST_UPDATED_BY VARCHAR2(64) NOT NULL ENABLE,
LAST_UPDATE_DATE TIMESTAMP(6) NOT NULL ENABLE,
LAST_UPDATE_LOGIN VARCHAR2(32)
Copyright © 2017, Oracle and/or its affiliates. All rights reserved.

Oracle Supply Chain Planning Cloud for E-Business Suite Customers
December 2017
Authors: Vijay Pillarisetti, Anurodh Saxena, Suresh Naik

This document is provided for information purposes only, and the contents hereof are subject to change without notice. This document is not warranted to be error-free, nor subject to any other warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or fitness for a particular purpose. We specifically disclaim any liability with respect to this document, and no contractual obligations are formed either directly or indirectly by this document. This document may not be reproduced or transmitted in any form or by any means, electronic or mechanical, for any purpose, without our prior written permission.

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Intel and Intel Xeon are trademarks or registered trademarks of Intel Corporation. All SPARC trademarks are used under license and are trademarks or registered trademarks of SPARC International, Inc. AMD, Opteron, the AMD logo, and the AMD Opteron logo are trademarks or registered trademarks of Advanced Micro Devices. UNIX is a registered trademark of The Open Group. 0114

Oracle Corporation World Headquarters
500 Oracle Parkway
Redwood Shores, CA 94065 U.S.A.

Worldwide Inquiries:
Phone: +1.650.506.7000
Fax: +1.650.506.7200

oracle.com
