TABLE OF CONTENTS
1. How to Utilize an HFM Application as a Source
2. How to Utilize an Essbase Application as a Target
3. How to Utilize an Essbase Application as a Source
4. How to Enable and Configure Map Monitor Reports
5. How to Install ODI Studio for FDMEE
6. How to Change the ODI SUPERVISOR Password
7. How to Improve Productivity in ODI Studio with 3 Settings Changes
   1. User Parameters
Step 6: Link ODI Project Variable to FDMEE Source Adapter Parameter (ODI)
Step 8: Add the ODI Project Variable to the Appropriate ODI Package (ODI)
Step 9: Modify the ODI Interface to Include Filter (ODI)
One of the enhancements in FDMEE 11.1.2.4 is the ability to utilize an HFM application not only as
a Target but also as a Source. This step-by-step tutorial demonstrates that functionality, i.e., extracting
data from HFM instead of from a file or an ERP such as SAP or EBS. Our next post will demonstrate interfacing
this data to an Essbase application as the target.
This tutorial will proceed with version 11.1.2.4 of EPM installed on Windows Server 2012 R2 and with the
COMMA4DIM HFM application.
3. Select the Setup tab and then Target Application, which will serve as both the source and target in this
example. EPM applications are not specified as a Source System.
10. Select EPM as the Source, which enables the HFM application to be used as a Source.
11. Update the remainder of the Import Format Details: Name, Description, and Source as COMMA4DIM
and Target Type as EPM. Once this is done, select Save.
19. Select Workflow and then Data Load Rule. If needed, update the FDMEE POV for the Location,
Period, and Category created in the previous steps.
22. A Source Filter is needed for each Dimension Name, which can be selected from the drop-down;
every HFM dimension must be specified. The Filter Conditions determine which data the extract
returns. Note that the Year and Period dimensions are not specified here.
Select Add to specify a filter, then from the Dimension Name drop-down select Account and then the
Ellipsis.
25. If a dimension is not specified, or an invalid member is specified for a dimension, an error message
similar to the one shown will be displayed in the log, accessible from Process Details. For example, this
error message was generated by an invalid year (Y#200 instead of Y#2007 in the Filter Slice).
33. Once the application is accessible, select Consolidation --> Maintenance --> Task Audit, which will
show that the extract occurred as a Flat File Extract.
In the first tutorial, new FDMEE functionality with release 11.1.2.4 was demonstrated. This tutorial will
demonstrate the HFM source data interfaced to an Essbase application and assumes completion of
part one.
This post will proceed with version 11.1.2.4 of EPM installed on Windows Server 2012 R2, the COMMA4DIM
HFM application, and the Sample Basic Essbase application.
1. Log in to Workspace.
5. Select Essbase.
Starting with this step and continuing through Step 25, the following repeats Part One and can be
skipped if Part One was completed.
10. Select EPM as the Source, which enables the HFM application to be used as a Source.
11. Update the remainder of the Import Format Details: Name, Description, and Source as COMMA4DIM
and Target Type as EPM. Once this is done, select Save.
14. Update the Location Details to COMMA4DIM for Name and Import Format. The remainder of the
Location Details are default and will not be changed in this example. Once the updates have occurred,
select Save.
16. Create the Period Key by selecting Add and then keying the values displayed. The Target Period
Month and Year Target will be utilized as the period and year filter for the data extract; therefore, the
values keyed should be consistent with HFM. Select Save and proceed to the next step.
18. Key BudV1 as both the Category and Target Category with Frequency of Monthly. Select Save once
this has occurred.
22. A Source Filter is needed for each Dimension Name, which can be selected from the drop-down, as
every HFM dimension must be specified. The Filter Conditions determine which data the extract
returns. Note that the Year and Period dimensions are not specified.
Select Add to specify a filter, then from the Dimension Name drop-down select Account and then the
Ellipsis.
25. If a dimension is not specified or an invalid member is specified for a dimension, an error message
similar to the one below will be displayed in the log accessible from Process Details. For example, this
error message was generated by an invalid year, Y#200 instead of Y#2007 in the Filter Slice.
27. Each dimension requires at least one map to specify how the imported member(s) map to the target
member(s). In the interest of time, a one-line Like map will be created in this tutorial.
29. Select the Dimensions drop down and choose the next dimension, which is Measures.
32. Repeat the addition of a map for the Product, Scenario, and Year dimensions, utilizing the same
process as Market and Measures. The maps are displayed for each dimension.
One of the enhancements to FDMEE 11.1.2.4 is the functionality to utilize an Essbase application not
only as a Target but also as a Source. Unfortunately, prior to PSU 11.1.2.4.100, a product bug
prevented this functionality. With the release and installation of PSU 11.1.2.4.100, bug 20747662
("Import from EPM source (Essbase) fails") has been resolved. This step-by-step tutorial will demonstrate this
functionality, i.e., extracting data from an Essbase Block Storage Option (BSO) application.
This post will proceed with version 11.1.2.4.100 of FDMEE and 11.1.2.4.000 of the other EPM products
mentioned, installed on Windows Server 2012 R2, and with the Sample.Basic application/database.
1. Log in to Workspace.
3. Select the Setup tab and then Target Application, which will be the source and target in this
example. EPM applications are not specified as a Source System.
5. Select Essbase.
10. Select EPM as the Source, which enables the Essbase application to be used as a Source.
11. Update the remainder of the Import Format Details: Name, Description, and Source
as Sample.Basic and Target Type as EPM. Once this is done, select Save.
17. Select Application Mapping and then Sample.Basic from the Target Application drop down. Then
select Add.
22. Select Workflow and then Data Load Rule. If needed, update the FDMEE POV for the Location,
Period, and Category created in the previous steps.
25. A Source Filter is needed for each Dimension Name, which can be selected from the drop-down, as
every Essbase dimension must be specified. The Filter Conditions determine which data the extract
returns. Reminder: the Period dimension is not specified.
Select Add to specify a filter. Then, from the Dimension Name drop-down, select Market and then
the Ellipsis.
Once this is done, select the greater-than arrow to add the selection to the Selected Members, and
then OK. Note: Select members that have data; otherwise, zero records will be extracted, which does
not generate an error message.
28. Select Execute. Note: Wildcard maps were previously created for each dimension except
Scenario, which has an Explicit map with Actual as the Source and Budget as the Target.
29. When the Execute Rule dialog box displays, select the Import From Source option, which
automatically selects Recalculate, and then select Export to Target. Once this is done, select Run.
34. With a successful completion, log into Essbase Administrative Services to view the log, a sample
of which is captured below.
36. Depending on your browser settings, answer the dialog boxes appropriately to open the file. Based
on my browser settings, I selected Leave this page and then Open.
38. Further in the log, the Sample_171.dat file is imported, and because the ARCHIVE MODE is set to
Move, an archive file is created.
41. Select Add and then key and/or select the following: Script Name ClrAToB, Script Scope Data
Rule, Scope Entity Sample.Basic, and Event Before Data Load.
44. Select RTYear and then Script Value POV Period. Once this is done, repeat the process to set
the RTScenario to POV Category. Once both parameters are specified, select OK.
48. The successful setting of the RUNTIMESUBVARS can also be verified from the Essbase Sample
application log, as displayed in the excerpt from this log.
For many organizations, Map Monitor Reports are a key internal / SOX control. Specifically, the PSU
11.1.2.4.100 Readme states: "Two new Map Monitor Reports have been added to the Audit Reports report
group: Map Monitor for Location and Map Monitor for User. Map Monitor Reports show changes made to the
data load mapping rules using any of the methods: data load mappings UI, Excel template imports, text file
imports, and Lifecycle Management imports." The Map Monitor Reports do not capture historical data earlier
than release 11.1.2.4.100, and they are enabled only if Enable Map Audit is set to 'Yes' in
System Settings.
This blog post will demonstrate how to enable the functionality and detail which map types are logged
(monitored) based on the type of change, and which are not.
7. Update the Source Value to record a change and then select Save.
8. Finally, delete the map by selecting the Source Value and then Delete. Once the Delete Confirmation
occurs, select Yes.
11. Two reports are provided: Map Monitor Report by Location and Map Monitor by User. Select Map
Monitor Report by Location and then Execute.
12. The Location will default to the POV location. Select the Action Type magnifying glass, which
provides the Select Parameter Value dialog box. Select All and then OK.
16. The Generate Report prompts are the same between the two reports, with a User prompt replacing
the Location prompt, with one notable exception: the User prompt does not have the lookup feature.
Therefore, key the appropriate User, Start Date, and End Date in the date format specified, and then
select OK. Once this is done, select OK and, when prompted, Open the report.
22. After pressing Enter a handful of times to adjust formatting, the entire query syntax is visible. The Select
and Where clause sections are typical syntax. Therefore, the analysis will begin with one of the
two objects listed after the FROM clause, aif_artifact_audit_v. Note: The tpovpartition object is the table
that stores location detail, which is the reason for starting with aif_artifact_audit_v.
24. The view syntax will be displayed and, after a few line breaks, reformatted for readability. Essentially,
the view is a standard select with a conversion of LAST_UPDATE_DATE to a VARCHAR(25) datatype,
aliased as both CHANGE_TIME and CHANGE_DATE. As the view does not have a WHERE clause, our
next step is to review the table AIF_ARTIFACT_AUDIT.
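Because the view has no WHERE clause, all filtering happens in the report query itself. As a hedged sketch of the kind of query behind the report, the pattern can be simulated with an in-memory SQLite database (the column names below are illustrative assumptions, not the actual AIF_ARTIFACT_AUDIT schema, which should be confirmed against your FDMEE repository):

```python
import sqlite3

# Hypothetical, simplified stand-in for AIF_ARTIFACT_AUDIT.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE AIF_ARTIFACT_AUDIT "
             "(ARTIFACT_NAME TEXT, ACTION TEXT, LAST_UPDATE_DATE TEXT)")
conn.execute("INSERT INTO AIF_ARTIFACT_AUDIT VALUES "
             "('1000-1999', 'UPDATE', '2016-01-15 10:00:00')")
conn.execute("INSERT INTO AIF_ARTIFACT_AUDIT VALUES "
             "('CashMap', 'INSERT', '2016-01-14 09:00:00')")

# Most recent changes first, mirroring how an audit report is typically read.
rows = conn.execute(
    "SELECT ARTIFACT_NAME, ACTION FROM AIF_ARTIFACT_AUDIT "
    "ORDER BY LAST_UPDATE_DATE DESC").fetchall()
print(rows)  # [('1000-1999', 'UPDATE'), ('CashMap', 'INSERT')]
```

Running the same kind of select directly against the repository table is a quick way to spot-check which map changes the audit actually recorded.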
26. Essentially, I added a map for each FDMEE map type and then changed each map. The results of the
subsequent Map Monitor by Location report are displayed. In summary, changes are recorded for the
Between, In, and Like map types, but not for Explicit or Multi Dimension. Table 1 provides further detail.
TABLE 1

Map Type   Action   What Changed   Logged
--------   ------   ------------   ------
Explicit   Add      N/A            Y
Explicit   Change   Source Value   N
Explicit   Change   Target Value   N
Explicit   Change   Change Sign    N
Explicit   Change   Description    N
Explicit   Delete   N/A            Y
Between    Add      N/A            Y
Between    Change   Source Value   Y
Between    Change   Target Value   Y
Between    Change   Change Sign    Y
Between    Change   Description    N
Between    Change   Rule Name      Y
Between    Delete   N/A            Y
In         Add      N/A            Y
When creating this blog post, I opened an SR with Oracle and a bug number was assigned: Bug 22526857 -
AIF_ARTIFACT_AUDIT TABLE NOT LOGGING ANY CHANGES FOR EXPLICIT AND MULTI DIMENSION.
FDMEE has a full Oracle Data Integrator (ODI) backend installed by default. If you enter through Workspace,
you will see that not much has changed in the user interface. However, if you are an ODI developer, the
screens you are used to seeing in ODI Studio are not available.
ODI has expanded logging and generally offers more granularity than what is available within the FDMEE user
interface alone. To use this granularity, you have to install the ODI Studio client, which can be installed on the
server or even on most laptops. ODI Studio is especially helpful when things don't quite go as expected, since it
allows better insight into the processes and how they function (or don't function).
Within the Hyperion configuration process, Oracle now installs most ODI components that in the past were
not a simple install, such as the ODI agent and the necessary Windows services. But one thing that it still
does not install by default is ODI Studio.
ODI Studio can be found by going to eDelivery and, after some searching for Hyperion products, you
will find the option to download Oracle Data Integrator Studio (11.1.1.7.0):
Note: Now that you know that item number V37940-01 (1.5 GB) is the ODI Studio that you require, you can
search for that number to minimize your search efforts. You might also find the full version of ODI
11.1.1.7 (V37390-01_1of2.zip and V37390-01_2of2.zip, 2.8 GB), but we are not using those files here.
Notice that this is not the full version of ODI. If you try to install anything other than the Studio, you will
get errors. In fact, you will get errors with this install even if you only install the Studio; it appears
that, with this download, Oracle has left out a few CAB files.
The good news is that the CAB files that remain allow the Studio to be installed, and it connects nicely to the
FDMEE backend. You will need to log in to Workspace and open FDMEE to get the configuration information
used to log in to ODI.
To get the configuration information, in FDMEE click the Setup tab, select System Settings in the
Configure section under Tasks. You might even want to select the ODI option from the Profile
Type dropdown menu.
This information is all that you will need to configure your newly installed ODI Studio. Once you start
your ODI Studio console, you will need to click on the Connect To Repository button. This will bring
up our old friend the ODI login screen:
You should be able to click the button for the Work Repository search and select FDMEE at this
time. This also tests your relational setup, if you get this option:
There you have it: the ODI Studio in all of its glory. The agent is already running and you can use the
logging of the Session List.
During the installation and configuration of FDMEE, the ODI master repository and agent are installed with a
default username of SUPERVISOR and a default password of SUNOPSIS. Securing the Oracle Hyperion EPM
environment by changing the SUPERVISOR password is not a single step; it requires changes in three separate
interfaces: FDMEE, ODI Studio Console, and WebLogic / Oracle Enterprise Manager.
This post will proceed with version 11.1.2.4 of Oracle EPM installed on Windows Server 2012 R2. The steps
for 11.1.2.3.X of EPM are the same except for Operating System differences.
5. Key the new ODI password for User Name SUPERVISOR and then select Save.
6. After a successful Save, the message "Your changes have been saved" will display.
7. To verify the ODI SUPERVISOR password has changed, select Check ODI Connection, which will
display an Error. Select OK to close the error.
9. Key the User SUPERVISOR and the Password SUNOPSIS, and then select OK.
11. Select Users > Supervisor then right click and choose Open.
17. Select Test, which will generate an error. This verifies the password has changed. Select OK to
close the Error dialog box.
18. Select Start Admin Server for WebLogic Server Domain from Apps By Name.
19. Once the Admin Server has started, select Admin Server Console from Apps By Name.
22. Key the WebLogic User Name and Password created during installation and then select Login.
25. Select EPMSystem > WebLogic Domain > Security > Credentials.
30. Return to ODI Console and select ODI > Disconnect SUPERVISOR.
32. Key the User SUPERVISOR and the updated Password, and then select OK.
34. Select Agents > OracleDIAgent and then double click OracleDIAgent.
35. Select Test, which will generate an Agent Test Successful dialog box. Select OK to close the dialog
box.
Similar to any client program, ODI Studio has various settings that can be updated to improve productivity. This
post details three settings that I typically adjust.
Assuming a default install and previous configuration of the ODI Studio Client, begin by selecting Start --> All
Programs --> Oracle --> Oracle Data Integrator --> ODI Studio.
1. User Parameters
To change any User Parameter value, execute these steps: double click the value, key the updated value, and
then select Enter. Select OK to exit the Edit User Parameters dialog box. To enable the change in ODI Studio
Client, exit and then open the program.
When building interfaces, the source and target often have column names that are the same. To enable
Automatic Mapping without having to confirm the feature, change the default from Ask to Yes. For the
Automatic Mapping option, key "Yes".
When executing procedures, interfaces, packages, scenarios, etc. from the ODI Studio client, the default
execution Agent selection is "Local (No Agent)," which would execute with the resources of the computer you
are logged into. Typically it is preferred to execute with server resources; therefore, to set the Default Agent,
key the preferred Agent (which in this example is OracleDIAgent). This can be a significant time-saver, whether
your environment has one agent or seven agents.
Alternatively, the limit dialog can be disabled by changing the default from 1 to 0.
2. Conf File
Next, a configuration change will be made to increase the memory available to the ODI client to reduce or
prevent "java.lang.OutOfMemoryError: Java heap space" errors; the change is made in the ide.conf file.
With a typical install, the configuration file is found on the install drive under
"oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin". To find the location from the ODI Studio client, log
in to the client and then select Help --> About --> Properties.
Open this file with your preferred text editor. Its IncludeConfFile parameter provides the location of ide.conf,
which is C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\ide\bin\ide.conf. Open ide.conf with your
preferred text editor.
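For reference, the heap settings in ide.conf are AddVMOption lines. The values below are illustrative assumptions only, not a recommendation; size them to the client machine's available memory:

```
# ide.conf excerpt -- illustrative values only
AddVMOption  -Xms256M    # initial Java heap size
AddVMOption  -Xmx1024M   # maximum Java heap size
```

Restart ODI Studio after saving the file for the new heap settings to take effect.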
3. Log Files
ODI has several logs in different locations. One of the logs is written to java.io.tmpdir, which is used by multiple
Knowledge Modules. To find the location from the ODI Studio client, log in to the client and then select Help -->
About --> Properties and scroll until java.io.tmpdir is displayed. This location is utilized when "Local (No
Agent)" is used.
If you are using the OracleDIAgent installed with FDMEE, assuming installation on Windows, the location is
defined in the registry. Specifically, HKEY_LOCAL_MACHINE --> SOFTWARE --> Hyperion Solutions -->
ErpIntegrator0 --> HyS9aifWeb_epmsystem1.
One of the numerous enhancements when migrating from FDM Classic to FDMEE is the capability to
schedule a batch from the Web Interface. However, once the batch is scheduled, how does an
individual update or validate the schedule? The interface does not provide this option.
Before I proceed with how to update or validate the schedule, below is a brief tutorial on creating an open
batch definition, which will subsequently be utilized in the scheduling example.
In this example, the Name will be Schedule. The name of a Batch Definition is at the discretion of the
individual creating the Batch Definition; however, I would recommend creating a name that is relevant
to either the FDMEE location or FDMEE data load rule. The Target Application selected is COMMA4DIM,
which is a Hyperion Financial Management (HFM) application previously created as a target, and Type
will be Open Batch, which enables the automation of loading one or more data files. Once the
selections are updated, select Save.
The Batch Execution Summary will display the Batch Definition previously created. Select the Batch
The Schedule options presented are typical of most scheduling software. In this example, the schedule
will be daily at 12:30 AM, which is displayed in the second image after the initial Schedule options.
Once the schedule is updated, select OK.
With the Batch Name scheduled, the question is: now what? From the FDMEE web interface, the current
options are Cancel Schedule, which will cancel one or more schedules for the Batch Name, or
Check Status, which provides the current status. Neither option provides functionality to change or
validate the schedule. Hopefully, Oracle will provide this feature in a soon-to-be-released PSU or PSE.
Without this feature accessible from FDMEE, the alternative is to utilize ODI Studio.
From the menu bar, select View > ODI Operator Navigator.
Select and expand the following: All Schedules > COMM_EXECUTE_BATCH Version 001
> Scheduling > GLOBAL / OracleDIAgent. Once this is done, select Open. Note:
COMM_EXECUTE_BATCH is the ODI Scenario that is scheduled when the Batch Definition Type is
Open Batch.
As mentioned, COMM_EXECUTE_BATCH is the ODI Scenario that is scheduled for an Open Batch.
What occurs when two or more Batch Definitions are scheduled? In the circumstance of two scheduled
Batch Definitions, two GLOBAL / OracleDIAgent listings are displayed. How does one differentiate
between the two (i.e. which Batch Definition is associated with which listing)?
Select the Variables option, which will display four Session Variables, one of which is the Batch
Definition name.
Another method to verify the schedule, which will also be used to change the time from 12:30 AM to
2:00 AM, is to select View > ODI Topology Navigator from the menu bar.
Select and expand the following: Physical Architecture > Agents > OracleDIAgent. Once this is done,
select Update Schedule.
The Schedule dialog box will render with a default time frame of two hours from the execution
time. Select the greater-than sign to update the schedule two hours at a time.
One of the best practices of interfacing general ledger data to Hyperion Financial Management (HFM) is
the loading of data with a Year To Date (YTD) View instead of Periodic. However, there are
circumstances, such as a partial year implementation of a ledger without history, that prevent this best
practice. This recently happened at a colleague's client. When presented with this requirement, my
colleague checked the Import Format and Target Application options, neither of which has a View option. As a
result, this blog post presents the solution needed to load Periodic data to HFM utilizing FDMEE.
After I received the inquiry from my colleague, I also checked the same options, on the theory that two sets of
eyes are better than one. Being unsuccessful, I checked the latest version of the FDMEE Administrator guide
and found the DATAVIEW field in the TDATASEG table. The DATAVIEW is "Hard coded to YTD for file," which is
how data was being loaded in this circumstance. Based on this knowledge, a short event script was needed.
Before I display the script, this simple one-row data load displays the business need to load data with a
Periodic View instead of YTD based on the ledger data received.
In this example, two hundred dollars was interfaced to an Income Statement account in June of the current
year. The first image is YTD and the second is Periodic. Both images display two hundred dollars (200), as
expected.
To correct the data load, an event script is needed to change the value in the DATAVIEW field from YTD to
Periodic before the export file is created. Time to open Eclipse, my editor of choice for Python, and type
the few lines of code displayed in the next image.
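In case the image does not reproduce, here is a minimal sketch of the idea. In FDMEE, the update runs inside a BefExportToDat event script through the injected fdmAPI object; below, the same change is simulated on plain dictionaries so the logic is visible. The SQL text and the simulation are illustrative assumptions, not the author's original script:

```python
# The update the event script would perform against the staging table (sketch):
UPDATE_SQL = ("UPDATE TDATASEG SET DATAVIEW = 'Periodic' "
              "WHERE LOADID = ? AND DATAVIEW = 'YTD'")

def periodicize(rows, load_id):
    """Simulate the UPDATE above on staged rows held as dictionaries."""
    for row in rows:
        if row["LOADID"] == load_id and row["DATAVIEW"] == "YTD":
            row["DATAVIEW"] = "Periodic"
    return rows

staged = [{"LOADID": 101, "DATAVIEW": "YTD", "AMOUNT": 200.0}]
print(periodicize(staged, 101)[0]["DATAVIEW"])  # Periodic
```

The key point is the timing: the DATAVIEW value must be flipped before the export file is generated, which is exactly what the BefExportToDat event provides.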
After the implementation of the script as a BefExportToDat event, the interface was executed again. The first
image after this paragraph reflects a corrected YTD value of 450, and the second image reflects a remedied
periodic value of 250.
The various menus, options, and setup screens within FDMEE and ODI can be overwhelming. This is why I
was intimidated when initially challenged with the task of leveraging FDMEE parameters to filter data sets, but
the task turned out to be a great learning experience. Going through the process helped me gain a better
understanding of the relationship between FDMEE and ODI objects.
OBJECTIVE
Provide users with the option of filtering data based on account type. In this particular example, users can
choose to include or exclude Income Statement accounts in their data load.
Accounts (Level 0) are 5 digit numbers
Non-Income Statement Accounts (Balance Sheet) begin with 1,2,3,4
This reference to p_incl_inc_stmnt in the code above is a direct reference to the FDMEE Source Adapter
Parameter created in Step 1 > Parameter Name: p_incl_inc_stmnt
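Since the ODI interface filter itself is shown only as an image, here is a hedged Python illustration of the rule it implements: accounts are five-digit numbers, Balance Sheet accounts begin with 1-4, and the p_incl_inc_stmnt parameter decides whether Income Statement accounts (by implication, those beginning with 5-9) survive the load. The function name and the 'Y'/'N' flag values are assumptions for illustration:

```python
BALANCE_SHEET_PREFIXES = ("1", "2", "3", "4")

def keep_account(account, incl_inc_stmnt):
    """Return True if a row passes the account-type filter.

    incl_inc_stmnt mimics the p_incl_inc_stmnt parameter: 'Y' keeps every
    account, 'N' keeps only Balance Sheet accounts (leading digit 1-4).
    """
    if incl_inc_stmnt == "Y":
        return True
    return account[0] in BALANCE_SHEET_PREFIXES

accounts = ["10010", "40020", "50030"]
print([a for a in accounts if keep_account(a, "N")])  # ['10010', '40020']
```

In the actual implementation, this condition lives in the ODI interface filter expression, with the parameter value passed through from the FDMEE Data Load Rule at execution time.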
Once all the FDMEE and ODI objects have been successfully set up and saved, the user will then be able to
filter the data set by updating the parameter value on the Data Load Rule menu in FDMEE and clicking on the
Execute button.
This is just one example of how to leverage FDMEE Source Adapter Parameters. There are numerous ways to
utilize this feature.