Planning Area
Dear friends,
Hello Vishal,
What is the data load status in the InfoCube? Try this: go to your InfoCube, right-click,
and select Manage. Tell me if the request status is green or yellow.
It should be green. Let me know.
Hi Vishal,
2. If the above is not the problem, run the load activity in the foreground and analyze the
error log generated. Based on the error log statements, I might be able to suggest some
other suitable way to achieve the desired results.
Regards
Ankur
Vishal -- Did you ever solve this problem? I am having the same problem. I can see the
data in the infocube (using transaction LISTCUBE) but the data is not populating in the
planning book. There is no error message; the data just does not show up in the planning
book.
All data in the infocube was loaded from Excel. Some data from the same KF in the
infocube does show up in the planning book. This data was entered several months ago.
But now, using the same Excel process, the data that is loaded into the infocube appears
in the infocube, but does not populate the key figure in the planning book in APO.
Hi,
A couple of checks:
1. Is the data being loaded into the planning area in the same time range as the
planning area initialization?
2. Do you have CVCs created for the data being loaded?
3. If these still do not give you a solution, then try transaction
"/SAPAPO/TSCUBE". You can manually load the data into the planning area from a
cube.
Hope this helps.
Hi Visu -- Thanks for the suggestions. All good ideas. I've already
checked all these. The time series objects are initialized with the proper
time frame (in fact, I deactivated the planning area and then re-initialized
the time series), all the CVCs are created, and I also tried TSCUBE but it
also was not successful. I also verified that the units of measure match. I
also re-activated the update rules in the infocube, but that did not help. It
seems as if "something" has changed, perhaps in the cube, because other
data previously loaded for the exact same CVC does show up in the
planning book. When I look at the data using LISTCUBE, the data that is
loading into the planning book looks EXACTLY IDENTICAL to the data
that is not loading (only the quantities are different, which is how I can
tell which records are loading and which records are not loading).
For those who may be watching this thread: The problem has been
solved. The problem was that there were several failed
infopackages that had been attempted to be uploaded (via flat file
from Excel) into the infocube. Apparently, these failed
infopackages were somehow interfering with APO's ability to read
any subsequent data from the infocube, i.e., APO could not "see"
any data in the infocube that had been uploaded to the infocube
after the failed infopackages. It seemed similar to the way a
problem in the CIF will block all subsequent traffic in the CIF. In
any case, after deleting the failed info packages from the infocube,
everything was fine and all data that had been loaded after the
failed infopackages was suddenly visible in the planning book in
APO.
We have two planning areas, for DP and SNP. It is working for one planning area: we are able to
release the data from DP to SNP and it appears in the SNP planning book. But it is not
working for the second planning area, and the data does not appear in the SNP planning book.
We ran all the consistency checks to remove internal inconsistencies. The forecasts for all the
products are not being populated in the SNP planning book even after a successful release in
/SAPAPO/MC90.
Ashfaq,
Aparna Ranganathan
Ashfaq
This looks like an issue with the category assignment in the planning area to me. Go to
/n/sapapo/msdp_admin, find the problematic planning area, and check what category /
category group is assigned to your forecast key figure. If your forecast gets released to SNP
with category FC, then that category has to be assigned to the forecast key figure in your
SNP planning area. I bet this is the problem, since you are able to see the forecast values
in the other SNP planning book.
Thanks
Aparna
Hello,
Please first check whether you can see the forecast orders in product view /sapapo/rrp3.
Please also check the 'Forecast' tab there -- the forecasts may be consumed by other
requirements.
If you can see the forecasts in the 'Elements' tab, then you can continue to check the planning
area design.
Make sure correct category group is assigned to the key figure you want to show forecast,
also the category FA and FC should be included in the category group.
Please also make sure no macro is affecting the key figure.
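The category-group rule Ada describes can be sketched as a simple membership check. This is just an illustration, not SAP code; the group name "FCG" is a hypothetical example:

```python
# Hypothetical category group assignment: a released forecast order is only
# visible in a key figure whose assigned category group contains the order's
# ATP category (e.g. FA or FC).
CATEGORY_GROUPS = {"FCG": {"FA", "FC"}}   # "FCG" is an invented group name

def forecast_visible(order_category: str, kf_category_group: str) -> bool:
    """Return True if the order's category is in the key figure's group."""
    return order_category in CATEGORY_GROUPS.get(kf_category_group, set())

print(forecast_visible("FC", "FCG"))   # category is in the group
print(forecast_visible("FC", "XYZ"))   # group missing: forecast not shown
```

If the forecast is released with category FC but FC is missing from the group assigned to the key figure, the check fails and the planning book stays empty, which matches the symptom in this thread.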
Best Regards
Ada
Ashfaq Ali
The issue is that the problem exists with only one particular material and location
combination; however, the material is releasing and appearing in the planning
book with other locations.
Can anyone point out what the issue could be?
Ashfaq Ali
The released forecasts are appearing in the RRP3 t-code under the Elements tab, but are
not visible in the planning book.
Ashfaq Ali
The issue is resolved now; there were inconsistencies that were identified in
/sapapo/om17.
For a given product, we have APO DP forecasts released to SNP, as standard ATP category FA.
The forecasts released to DC locations (loc type 1002) are shown in the APO product view as
having APO application of 'Planned Independent Requirements', and I can delete these via pgm
/SAPAPO/RLCDELETE.
The forecasts released to factory locations (loc type 1001) are shown in the APO product view as
having APO application of 'Initial' however, and I cannot delete these via pgm
/SAPAPO/RLCDELETE.
My questions are:
- why is the APO application different for the forecasts at the different location types?
- how can I delete the forecasts at the factory location by a batch program?
Thanks
Ramesh
Hi,
I have also sometimes faced issues in deletion of forecasts, though I am also not sure why, and I
didn't dig further into it.
You could try using transaction /SAPAPO/MD74 or report /SAPAPO/RM60RR20 in a batch job
to delete these forecasts. I think deletion should also be possible through the report
/SAPAPO/DELETE_PP_ORDER.
Thanks - Pawan
Hi,
Further to Pawan's reply, if the orders are still not getting deleted by report
/SAPAPO/DELETE_PP_ORDER, then try below.
Regards
Manotosh
Hi,
This can be done by using the report "/SAPAPO/RM60RR20" in a batch job to delete the
forecasts,
or you can use transaction /SAPAPO/MD74 to delete them through
foreground/background jobs.
Thanks
Ramesh
How to delete Master Data in APO?
How to rollback /delete master data ( ex. Product master / Material master ) which has
been CIFed from R/3 ?
In the product master, choose Product -> Mark for Deletion. On the pop-up
screen:
If you choose the product and then save, the system sets a deletion flag for the product. If
you choose the location and then save, the system sets a deletion flag for the specific location
data for the product.
If you choose the product and then click Continue, the system deletes the product.
If you choose the location and then click Continue, the system deletes the specific location
data for the product.
I think he is asking how to automatically delete the material via CIF, so that if you mark
the material for deletion in R/3 it deletes the material in APO.
I agree. But what he is really asking is: how do you undo a CIF that has happened,
right? I don't think you can undo that command. If you had your system backed up, you
could restore from the backup and lose the delta changes from the backup time to the
current time. That becomes a question of, "is it worth the loss?"
You are right. Manually deleting these products is the only way. But it would be nice if
products which were marked for deletion in R/3 would no longer be planned in APO.
OSS Note 331664 talks about a workaround, but it should be automatic.
Thanks for your help. The transaction (OM17) is for 'action in KANBAN processing'. Is it the
right transaction?
OM17 is for the liveCache consistency check... Can you explain how you can use this to
delete records CIF'd from R/3?
You are correct about OM17; it is not used to delete data. To delete transactional data in
APO, use transaction /SAPAPO/RLCDEL. To delete master data, you have to delete it
through mass maintenance or manually in the maintenance screen for the master data
element (material, PPM, resource, etc.).
As there is no CVC, I should be able to delete these old products from the DP module. This is
happening after applying the hierarchy or activating the master data.
If I load a flat file that has incorrect data, then I am able to delete it before activating.
Hi
1. One alternative: just try to delete that info object, and you will get an error message as below.
When you right-click on an info object, you get "Additional Functions", and there you have an
option "Display Logs".
This takes you to the SLG1 transaction and shows the detailed error log.
This way you will get the list of InfoCubes or other objects where the info object is being used.
2. Otherwise, click on the characteristic (info object) and click the "where-used" button to
find all the objects where your info object is used. Then go to each particular object and check
them one by one.
We too had this issue, and we started deleting a group of products, though it was a time-
consuming activity.
Diagnosis
No master data was deleted. Either because it was all being used, or because the user did not
want to delete upon request.
Procedure
First delete the master data in the places, in which it is still being used:
1. In the InfoCube
2. As an attribute
3. In a hierarchy
Then try deleting again.
Thanks a lot
The master data is linked to some nine InfoCubes.
The error log showed the problem. Now I am doing selective deletion from all the cubes to get
rid of this junk data.
Radhakrishnan
Maintaining the Business System Group
Use
In this IMG activity, you determine the assignment to a business system group of the APO System and
the SAP R/3 System to be connected. By doing this, you create areas with the same naming conventions.
This guarantees that the same names are used for master data, and that they are synchronized in
distributed system landscapes.
Prerequisites
As an independent logical system, the APO System must, in addition to the SAP R/3 Systems, be assigned
to a business system group (BSG).
Procedure
...
1. In Customizing for mySAP SCM – Implementation Guide, choose Inventory Collaboration
Hub -> Integration of SAP SCM and SAP R/3 -> Basic Settings for Creating the System
Landscape -> Maintain Business System Group.
2. Choose New Entries.
The New Entries: Overview screen appears.
3. Specify the following information for the business system group:
An alphanumeric key (maximum of 10 characters)
A description
4. Save your entries.
Example 1 of 2
An APO System is to be linked with two SAP R/3 Systems (A and B), in which two different materials (for
example a hammer and a screw), have the same material number (100). Both materials are to be
represented as two different products in the APO System.
Assign both SAP R/3 Systems to different BSGs. Assign the APO System to one of the BSGs.
SAP R/3 System A (material number 100 = hammer) -> BSG A
SAP R/3 System B (material number 100 = screw) -> BSG B
APO System -> BSG A
In order to avoid having two identical names and to be able to uniquely assign the material numbers,
you need SAP enhancement APOCF005. This is the inbound processing exit for products (transaction SMOD or CMOD).
Material number 100 from BSG A receives, for example, product number 100 in the APO System and
material number 100 from BSG B gets product number 100_B. This allows you to uniquely assign the
materials.
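The renaming convention from this example can be sketched in a few lines. This is only an illustration of the mapping rule, not the actual ABAP code of the APOCF005 exit:

```python
# Sketch of the BSG naming rule: material numbers coming from the BSG that
# the APO system itself belongs to keep their names; material numbers from
# any other BSG get a suffix so they remain unique in APO.
APO_BSG = "A"  # the BSG the APO system is assigned to (as in the example)

def apo_product_number(material_number: str, source_bsg: str) -> str:
    """Map an R/3 material number to a unique APO product number."""
    if source_bsg == APO_BSG:
        return material_number                 # leading BSG keeps its name
    return f"{material_number}_{source_bsg}"   # e.g. 100 from BSG B -> 100_B

print(apo_product_number("100", "A"))  # hammer from BSG A
print(apo_product_number("100", "B"))  # screw from BSG B
```

In the real system this logic would live inside the customer exit during inbound processing; the sketch only shows why the two materials no longer collide.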
For inbound processing in the APO System, the following SAP enhancements are available as customer
exits for master data:
APOCF001 : Inbound processing location
APOCF005 : Inbound processing product
APOCF008 : Inbound processing resource
APOCF0012 : Inbound processing production process model
Example 2 of 2
An APO System is to be linked with two SAP R/3 Systems (A and B). In SAP R/3 System A, a particular
screw has material number 110. In SAP R/3 System B, it has material number 120. Both material
numbers are represented in the APO System by one product, with product number 110.
Assign both SAP R/3 Systems to different BSGs. If possible, assign the APO System to the BSG
whose data does not have to be renamed.
SAP R/3 System A (material number 110 = screw) -> BSG A
SAP R/3 System B (material number 120 = screw) -> BSG B
APO System (product number 110 = screw) -> BSG A
You use SAP enhancement APOCF005 to convert the local SAP R/3 material number 120 to the APO
product number 110.
For the time being, this second scenario is only supported for material masters and product
masters. If, for example, the same customer is used with different BSGs in R/3 Systems A and B,
you must create two separate customer locations in the APO System. This applies for the vendor,
plant, and other master data.
There is further information in the application component SAP Advanced Planner and Optimizer
(SAP APO) under Master Data General.
Hi Sivaram,
1. As a DP functional consultant you need to resolve all the incidents raised in the APO DP module
(forecasting, lifecycle planning, master data in APO DP, planning books, macros, release to
APO SNP or R/3 or other systems, CVC maintenance, etc.), which will also include the BW part of
SCM-APO related to DP, like data extraction and backup into and from InfoCubes. You will
also support all the interfaces related to APO DP, for the APO-related part of the interface.
You need to resolve these incidents in the time frames defined by the SLAs for the support project.
You will also have to carry out problem management of some of the critical incidents so that
incidents do not recur.
Further, you will have to support release/change management (if it is part of your
support project contract).
This will involve testing changes in the quality system before they are moved to the production system.
Hi Sivaram,
Below are the points that I would like to share with you
Regards
R. Senthil Mareeswaran.
Hi Senthil,
a) Issue: Process chain failure
Resolution: Identify the logs and take corrective action.
Why did the process chain fail? Please mention the reasons as per your
observations.
Thanking you,
Sivaram
Hi
The two most critical issues in DP support which I have faced are as follows:
2. The second critical issue I find is restricting the pollution of the DP environment with non-relevant
master data (CVCs). This becomes the major cause of the problem mentioned in the
thread above, i.e., data mismatch between the planning book and the cubes.
Warm Regards,
Rahul Mohnot
Hi Sivaram,
Tickets on aggregation & disaggregation are raised quite often, so you should be aware of
the key figure disaggregation settings.
Many times a data mismatch between the cube & planning area is due to a CVC not being
present or not maintained correctly.
Tickets are raised by users stating the statistical forecast is not proper, mostly with the seasonal
model. This is due to improper sales history.
Regards,
Chetana
Support issues in DP
Hi experts,
Can anyone share two APO DP support issues? Anything on macros, realignment, lifecycle
planning, or process chains.
Thank you for your valuable time spent on this. Please answer ASAP.
regards,
Hi Sai,
Here are the details of the second point mentioned by me: data mismatch between two planning
books for the same key figure.
Issue: The user was facing an issue where, for one key figure, he was seeing different values in
two planning books.
Analysis: During the analysis we found that the key figure is a calculated key figure based on a
macro. In one planning book the key figure was calculated by a default macro, whereas in the other it was a
directly executable macro, or no macro existed for that key figure. For example, consider PB1 and
KF3, which is calculated through a macro as KF1 + KF2. The same key figure KF3 is part of
another planning book, PB2. Now when the user updated the values for KF1 and KF2 in
planning book PB1 and pressed Enter, he would see the updated value for KF3 in PB1,
but if he went to PB2 he would not see the updated values for KF3 unless he saved
planning book PB1 after the changes. The issue mainly happened here because the values were not
entered directly by the client for KF1 and KF2 in the planning books, but were uploaded through a
cube to the planning area.
Reason: The values do not get updated to liveCache if they are not saved. LiveCache
values are the ones which get displayed in the planning books.
Solution: We first provided training to the client, explained the different macro types, and also
made the macro part of a background job, so that every day whatever changes have happened
get saved to liveCache and the values get updated in all the planning books.
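The behavior Diana describes can be reduced to a small conceptual model. This is a deliberate simplification, not the real liveCache API: a session buffers its values, a default macro recalculates only inside that session, and other planning books see the result only after a save:

```python
# Toy model of liveCache vs. planning-book sessions (assumed simplification).
live_cache = {"KF1": 10, "KF2": 5, "KF3": 15}   # persisted key figure values

class PlanningBookSession:
    def __init__(self, has_default_macro: bool):
        self.buffer = dict(live_cache)          # session reads from liveCache
        self.has_default_macro = has_default_macro

    def enter(self, kf, value):
        self.buffer[kf] = value
        if self.has_default_macro:              # default macro fires on input
            self.buffer["KF3"] = self.buffer["KF1"] + self.buffer["KF2"]

    def save(self):
        live_cache.update(self.buffer)          # only now do others see it

pb1 = PlanningBookSession(has_default_macro=True)
pb1.enter("KF1", 20)                 # PB1 immediately shows KF3 = 25
pb2 = PlanningBookSession(has_default_macro=False)
print(pb2.buffer["KF3"])             # still 15: PB1 has not been saved
pb1.save()
pb2_after = PlanningBookSession(has_default_macro=False)
print(pb2_after.buffer["KF3"])       # 25 after the save
```

Running the macro in a background job, as in Diana's solution, amounts to calling the save step on a schedule so all books converge on the same values.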
Thanks,
Diana
Hi Sainath,
Rgds
Sourabh
Dear Sourabh,
Thanks for your reply.
My question is:
Can you share any two difficult issues from an APO DP support project?
How did the problem arise and how did you resolve it, from start to end?
Please reply ASAP; it would be very helpful for me.
Thanks.
ch.sainath
Dear Sai
Issues vary as per the business requirements and the configurations implemented in
a particular project. A couple of simple issues which were faced by our end users are
below:
> An area executive was unable to put estimates in the planning book. This was due to the
authorization not being available for that particular selection ID. Action was then taken
accordingly by providing the authorization.
> Aggregation/disaggregation results were not as expected. The data is then thoroughly
analyzed to find out why this happens, and it is corrected manually.
> Other issues include background jobs getting cancelled / errors in background jobs.
These can be solved by analyzing the logs generated.
Regards
Raghavendra Emani
Dear Raghavendra,
Thanks for your reply.
Question:
1. Why were the aggregation results not updated in the planning book, and what is the root
cause for this? How did you analyze it? What have you done to prevent this in the
future?
Hi Sainath,
APO DP issues can vary based on the client structure and implementation done.
1. Data mismatch between the cube and the planning area. This may be due to realignments,
data not flowing from R/3 to APO, or the process chain not updating the DTP
due to failures.
2. Macro-related issues, such as seeing different values in different planning books.
This may be due to a default macro in one planning book and not in another, because
the values shown in the planning book are not yet saved to liveCache.
Thanks,
Diana
As you said in your first point, there is a data mismatch between the cube and the planning
area. Could you please explain it in detail, taking an example scenario?
Questions:
1. Why did the data mismatch between the InfoCube and the planning area occur because of
realignment? What did you do in the realignment, and how does it affect the data?
2. Data not flowing from R/3 to APO: why, and where did you find the problem?
3. Why were the macro values not saved in liveCache, and what was the reason for it? What
solution did you suggest?
I'm very close to what I want; waiting for your reply to close the thread.
Regards,
ch. sainath.
Hi Sai,
Here are the details of the second point mentioned by me: data mismatch between
two planning books for the same key figure.
Issue: The user was facing an issue where, for one key figure, he was seeing
different values in two planning books.
Reason: The values do not get updated to liveCache if they are not
saved. LiveCache values are the ones which get displayed in the planning books.
Solution: We first provided training to the client, explained the
different macro types, and also made the macro part of a
background job, so that every day whatever changes have happened get saved to
liveCache and the values get updated in all the planning books.
Thanks,
Diana
Thank you for your valuable time spent on this. It will be very
helpful. Have a nice day.
Regards.
ch.,sainath
SAP APO DP
Hi,
Please tell me the biggest issues handled while implementing SAP APO DP, and the
challenges faced during implementation.
Thanks
Prasanna
Hi Prasanna,
Some typical DP implementation issues & the challenges handled are as follows:
1) Choosing the right forecast method: The effective way of handling this is to analyze the
past historical data in terms of the business requirements and strategy, and then arrive at
a suitable method which supports trend, promotion, and lifecycle planning.
2) Accuracy of the statistical forecast: This issue can be handled via effective cleansing of past
historical data containing one-time orders, obsolete products, products which are on hold by
marketing, etc.
3) Change management: Getting the users to start forecasting and end-user training is a big
challenge, which can be handled by appointing a change management person and putting a
change control strategy in place.
4) Complexity of forecasting: This can be avoided by choosing only relevant products to send
to and maintain in APO, and not planning all products in APO.
5) Complexity in configuring: This can be handled by choosing SAP standard objects like
planning area, planning book, profiles, etc., which saves time, effort, and resources, eases
implementation, and avoids later issues.
Regards
R. Senthil Mareeswaran.
E. Final solution!!
Some more (actually a lot more...) analysis and debugging took us back to BAdI
/SAPAPO/SDP_MASTER, but this time to method COMBI_ENRICH. The documentation of this BAdI
method suggests that it can be used to enhance CVC values just before creation. So this is the
place where we can modify CVC values as required. This method also gets called in all the above
ways of creating CVCs. Thus we decided to use BAdI /SAPAPO/SDP_MASTER~COMBI_ENRICH.
The technical pseudo-logic is discussed in the section below and gives a clear picture of how we
used this BAdI method for our validation scenario.
Hi Pankaj,
Splendid writing. I appreciate the way you presented the problem, explained your
analysis, and concluded with a solution, providing sufficient rationale and evidence.
Cheers,
Rajesh
Hi Kishore,
Is it a first time deployment or enhancement of existing application?
Anyway, prepare a solid cut over plan and a failure mode and effects analysis (FMEA) kind of
report highlighting integration touch points and possible failures.
Few factors you should consider for Post Go-Live hyper care are:
1. If the DP planning area already exists in production and you are moving PA-related
changes, be sure to take a backup of your planning area and deactivate it before releasing the TR.
Otherwise, the TR transfer will fail.
2. There might be security issues i.e. actual user ids have not been updated or mapped with
appropriate profiles.
3. With respect to macros, after go-live it is better to verify the activation status. Sometimes they
might be deactivated and require manual activation (quite rare, though).
4. Verify the period population of time bucket profiles. They should have been updated
appropriately.
5. Create time series objects for a sufficient period for the planning area.
6. Execute consistency checks (t-code /SAPAPO/TSCONS) to make sure your planning book
displays the data without any errors.
7. You have to create DP master data, i.e., CVCs, before populating your planning area with data,
and validate the application log. There should not be any red flags.
8. Validate your data transfer - verify the data transfer logs and data itself in the planning book
9. There might be an issue with scheduled jobs. Verify that the batch job definitions, the
required forecast profiles, job variants, and selection profiles are created in production as per your
checklist, without any typos or authorization issues.
10. DP execution - Results may not be as expected - verify the macros and procedure. Check that
the user who executed it has understood the process and has finished the required training.
11. DP output release - Check the logs and resolve any issues reported in the log
Thanks,
Rajesh
I have an issue with displaying the BUOM of products at aggregated level in interactive planning in
SCM APO DP 7.0 EhP2.
We have PC as the BUOM at the planning area and KG as the BUOM in MAT1 for all products.
So, when we load a single product in interactive planning, the BUOM is read/displayed correctly as
KG... but when we load a bunch of products into interactive planning which have the same BUOM
(KG), it reads/displays the planning area BUOM ("PC").
Thanks in Advance.!!
Ravi Kumar
That is standard functionality: what if one product does not have UOM KG, then what?
Are you able to manually change the UOM in the planning book after aggregating? I haven't
tried it, but it may work.
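The display rule Ravi is asking for can be sketched in a few lines. This is my own illustration of the desired logic, not SAP's implementation:

```python
# Sketch of the requested behavior: show the products' shared base UOM when
# every selected product agrees, and fall back to the planning-area UOM only
# when the selection mixes units.
def display_uom(product_uoms, planning_area_uom="PC"):
    """Return the UOM to display at aggregated level."""
    uoms = set(product_uoms)
    if len(uoms) == 1:            # every selected product agrees
        return uoms.pop()
    return planning_area_uom      # mixed units: no single product UOM fits

print(display_uom(["KG"]))              # single product
print(display_uom(["KG", "KG", "KG"]))  # all agree: KG would be shown
print(display_uom(["KG", "CON"]))       # mixed: planning-area fallback PC
```

The standard behavior described in the reply above corresponds to always taking the fallback at aggregated level, since the system cannot assume all products share a unit.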
Hello,
Coming to your point: if a product has a different BUOM (CON), then it displays
CON as the BUOM in interactive planning, which is working fine.
Please let me know if this clarifies your query, and please let me know if you have a
solution for my requirement.
Thanks
Ravi Kumar
Hi Ravi,
Thanks,
Rajesh
However there are many problems that are created by the Integrators/Consultants including our
“esteemed” blog writer due to their lack of knowledge of both demand forecasting as well as an
understanding of the tool. If they don’t understand the tool, they should not be implementing it
or training the users. The baggage they leave behind creates a mess that makes the software
worthless and unusable. These problems include:
1. Implement APO to be used as a typing tool and tell the users Statistics are terrible so not
worth using it.
2. Disable statistical modeling under the pretext of security. In reality, the consultants are
worried about fielding questions from the users on statistics that they do not understand
themselves.
3. Enabling options and parameters in the tool without any idea of what they do to the resulting
forecasts.
4. Deciding on important things such as forecast aggregations, forecasting levels and exception
management without any process discovery or user input.
5. Finally, not planning the project budget to include training for the users, particularly on Stats
and Demand Planning.
Perhaps companies should divide the implementation plan into two parts – process design that
includes the provision for model tuning and training, and a second part that involves system
design and making the system work. The system design should follow the process design.
The model tuning should be done by the same consultant that is responsible for process design.
And definitely that person should be an Expert not only in the tool but also in best practice
demand planning along with expert skills in training the planners.
Given how many implementations are currently plagued with tool problems aggravated by
Consultant inefficiency and incompetence, perhaps more companies should think about re-
implementations of APO DP. Just throw away the old concepts and practices and start thinking
about how to fix the problems and make the tool more usable.
Finally SAP also needs to wake up and start fixing the problems in their most popular SCM
module namely APO Demand Planning. It needs to fix the error calculations and the alert logic.
More on that in a separate blog entry.
To set the record straight with this “esteemed” writer so he does not pollute the waters and
mislead many planners, I would say the following:
The blog writer concludes that SAP APO calculates the MAPE incorrectly. I agree with him to
a certain extent. SAP purports to compute MAPE using the academic definition of averaging
percentages but it does not do this either. It goes into a hole when the actual demand is zero and
makes the MAPE metric unusable. However, the other metrics namely MAD and RMSE are
correct.
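The "hole" with zero actuals is easy to see numerically. The following is my own sketch of the textbook definitions of these metrics, not SAP's code, using made-up demand figures:

```python
# MAD and RMSE stay well defined on intermittent demand; the academic MAPE
# divides by the actual and blows up as soon as one period has zero demand.
from math import sqrt

def mad(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    return sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    # average of |error| / |actual| as a percentage; undefined for zero actuals
    return 100 * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / len(actual)

actual   = [100, 0, 80]     # one period with zero demand (invented numbers)
forecast = [ 90, 10, 70]

print(mad(actual, forecast))    # 10.0
print(rmse(actual, forecast))   # 10.0
try:
    print(mape(actual, forecast))
except ZeroDivisionError:
    print("MAPE undefined: zero actual")
```

This is why optimizing model fit on the MAD, as described below, is the more robust choice for demand series that contain zero-demand periods.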
I strongly disagree with this consultant/writer when he concludes that best-fit models are
erroneous because the error calculations are defective. If you poke around the underlying
mechanics, which are well documented in the APO online manual, you will see that the
optimization is done using the Mean Absolute Deviation, which is superior to the MAPE; MAPE
is a percentage and has some awkward properties.
Automodel selection 1 uses the MAD and picks only smoothing models, while
Automodel selection 2 also claims to include linear regression models. But in practice
Automodel selection 2 produces inferior results and expects the user to babysit the modeling by
feeding manual parameters!! In general, Automodel is not for the faint of heart, as many settings
have to be correctly configured. Given the fact that most configurations are done by junior
consultants from Big 5 consulting firms, you can guarantee that this is an unrealistic assumption.
Even with other forecasting packages we do not recommend best fit models or expert selection
as the final model. They are good starting points, but the planner has to do more in getting to the
right model and the demand forecast. They don’t have to be expert statisticians but they need to
understand their business and have a preliminary understanding of what various models do.
Yes, the statistical models are straightforward in APO; in fact, they are basic. There is no
complexity in them. They are not claiming to do Box-Jenkins or transfer functions or ARMAX
or any other models with esoteric names. However, I have found people use the MLR models
very cleverly, combined with forecast attributes. So it is all in the implementation, the model
tuning, and finally imparting that much elusive knowledge to the planners and finding a way to
sustain that knowledge.
My two cents = Do what you can and understand what you cannot. Some intellectual honesty
will also go a long way!
Happy Labor Day weekend!
Is Statistical Modeling an Afterthought?
Wednesday, March 21st, 2012
I just had a conversation with a Fortune 500 executive recently. He mentioned his company is
spending tens of millions of dollars currently upgrading from SAP APO 4.0 to SAP SCM 7.0
Demand Planning.
Come to think of it, what are the big differences between the 4.0 or 5.0 vs. 7.0? There are some
marginal improvements that the tech shop may admire but anything for the planning
community?!
Then we also hear that the planners have not been using the Statistical modeling feature in APO.
Will upgrading to 7.0 persuade the planners to use the Stat Models more? Not just more, just
even barely? Then I hear a pause and the IT consultant says that Stat models are not a priority
given the budget constraints they have.
So more millions before and no stat models. Now five years later, we have a shiny new upgrade
and again the Stats are not a priority.
I have been preaching Usability for the past few years.
Put together fine tools – But help the users in making the transition to the tool – give them
better understanding – Make the new tool more usable!
Give them the reports they need. Provide them an exception based workflow!
APO has good statistical models. They will help you move the peanut forward but only if they
are understood and leveraged.
We just re-launched the marketing campaign for our Usability Consulting. Model tuning
and model matching to product profiles are important elements of the Usability training.
Once implemented the Usability project will harmonize the use of models across planners from
various geographies for the same business/product family. There will be streamlined work flow.
We help you answer the following questions:
1. Am I using a Pareto Approach in my APO planning process?
2. How can I leverage APO DP to improve our forecast accuracy?
3. Why does APO mostly give me flat forecasts? How do I fix this?
4. What are Alpha, Beta, Gamma, Sigma and Theta? How do I leverage these parameters?
5. What is the correct level to model so as to improve the overall accuracy at the SKU level?
6. What are weighting profiles? How do they affect my final forecast?
7. How can I control time trend using trend dampening profiles?
8. Are there products and customers that are better left to APO’s automated modeling strategy?
9. Which models should I choose for which family of SKUs?
10. What are custom modeling profiles?
11. How is APO helping us simplify and improve the promotional planning process?
12. How do I create Multiple Linear Regression Models in APO?
13. Are we using the system-defined error metrics in APO? Why are they different from the classic
MAPE calculations?
14. How do you conduct phase-in/phase-out of products?
15. When should I not use Croston's model?
16. Why am I getting 9,000+ alerts every morning?
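As a hint for question 4 above: in exponential-smoothing models, alpha weights the most recent observation into the level and beta weights the most recent change into the trend; gamma would add a seasonal component on top. Here is a rough Python sketch of Holt's two-parameter method. This is illustrative only; the function name and default parameters are my assumptions, not APO code or APO settings.

```python
def holt_forecast(history, alpha=0.3, beta=0.1, horizon=3):
    """Holt's two-parameter exponential smoothing.

    alpha weights the most recent observation into the level,
    beta weights the most recent change into the trend.
    (Defaults are illustrative assumptions, not APO settings.)
    """
    level = history[0]
    trend = history[1] - history[0]
    for y in history[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # Project the smoothed level and trend forward.
    return [level + (k + 1) * trend for k in range(horizon)]

flat = holt_forecast([10.0] * 12)           # flat history: flat forecast
rising = holt_forecast(list(range(1, 13)))  # steady trend: forecast keeps climbing
```

With a flat history the forecast stays flat; with a steadily trending history the trend term carries the forecast forward, which is exactly what tuning alpha and beta controls. APO exposes further parameters (such as sigma and theta) that this sketch does not cover.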
I know that a process chain is basically a background job, but I need to know exactly what a
process chain is and why we use it on a support project. Thanks.
Hi Ganesh,
Process chains are sequences of processes that are run in the background after a
particular event is triggered. We give a start time and an end condition for these processes to run.
Process chains help us schedule daily background jobs/processes and so help us manage
complex schedules in SCM.
Thanks Anurag, that was a helpful answer, but do you have a guide on how to set up a process
chain and how to monitor the jobs?
Thanks
Anurag
Ya Anurag, but that link does not show anything regarding process chain setup.
Hello Ganesh,
The following is the only SAP help link describing process chains that
I could find for your need. Hope this helps you a bit:
http://help.sap.com/saphelp_scm50/helpdata/en/8f/c08b3baaa59649e10000000a1
1402f/frameset.ht
Process Chain
Definition
A process chain is a sequence of processes that are scheduled to wait in the background for an event.
Some of these processes trigger a separate event that can, in turn, start other processes.
Use
In an operating BI system there are a multitude of processes that occur regularly. If you use process
chains, you can:
Automate the complex schedules in BW with the help of the event-controlled processing,
Visualize the processes by using network graphics, and
Centrally control and monitor the processes.
Fundamental principles of the process chain concept are:
Openness
The abstract meaning of a process as any process with a defined beginning and end enables
openness with regard to the type of process that can be integrated into a process chain. The
principle of openness is applied to the theory behind process chains, in that both user-defined
programs and processes can be implemented. In addition, you can include process chains in other
process chains, known as meta chains. In doing so you are able to integrate process chains from
the system in which the meta chain is found, or from other systems. In this context, we are
talking about local or remote process chains.
Security
Using process chains offers a high degree of process security, which is based on the principles
of background management:
Processes are scheduled before they run and can be monitored with the standard batch
monitor.
See also: Process Chain Log Display
Background events start subsequent processes.
Short dumps and terminations are recognized and handled accordingly.
Flexibility
The subsequent process must get all the information it needs for a correct run from its
predecessors. This allows new process types to be integrated without the existing types having to
be adjusted.
Structure
A process chain consists of a start process, individual application processes and the collection processes.
Define the start of your process chain with the start process. All other chain processes are scheduled to
wait for an event.
The application processes are the actual processes. BI supports process types of the following
categories:
General services
Load processes and post-processing processes
Data target administration processes,
Reporting Agent processes
Other BI processes,
as well as processes that you have implemented.
If they are used in other SAP applications, you have further categories available, if applicable.
Collection processes are treated differently by process chain management. They allow multiple
chain strings to be combined into one single string. This allows them to replace the multiple
scheduling of the actual work processes.
Processes are connected using events that start a successor process after being triggered by a
predecessor process.
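The event-driven behavior described above (a predecessor raises an event on completion, which starts the successors scheduled to wait on it) can be modeled in a few lines. This is a toy sketch in Python, not SAP code; the process names are hypothetical, and real chains are maintained in the BW process chain transaction.

```python
from collections import defaultdict, deque

class ProcessChain:
    """Toy model of a process chain: each finished process raises an event,
    and the processes scheduled to wait on that event are started next."""

    def __init__(self):
        self.waiting_on = defaultdict(list)  # event name -> waiting processes

    def add(self, process, after_event=None):
        # after_event=None marks the start process (waits on no predecessor)
        self.waiting_on[after_event].append(process)

    def run(self):
        executed = []
        ready = deque(self.waiting_on[None])
        while ready:
            process = ready.popleft()
            executed.append(process)                # "execute" the process
            ready.extend(self.waiting_on[process])  # its completion event fires
        return executed

chain = ProcessChain()
chain.add("start")
chain.add("load_infopackage", after_event="start")
chain.add("activate_dso", after_event="load_infopackage")
chain.add("rollup_cube", after_event="load_infopackage")
```

Running the chain executes the start process first, then the load, then both follow-on steps, mirroring how successors are started only after the predecessor's background event fires.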
Integration
A process chain is a BI object with a transport connection and a connection to the BI document
management.
Automatisms
If you use process chains, the automatisms of the integrated processes (for example, update PSA data in
the data target, or activate data in the DataStore object) are ignored and you must implement them
using the process chain. If you schedule a specific process in a chain, the system supports you by
automatically inserting additional, relevant standard processes that take such automatisms into account.
If you use data transfer processes, the automatisms from InfoPackages are no longer available and you
must implement them using process types.
I would like to know the pros and cons for triggering background jobs in APO systems using a
third party software. The project which I am working on has numerous jobs in DP/SNP/PPDS
and gATP.
Warm regards,
Ashutosh,
In a pure SAP landscape I wouldn't vote for a third-party scheduler, as the return on
investment is probably not justified. In a complex multisystem, multi-ERP landscape it could be a
good choice.
Refer to:
This would give you a good insight. My personal experience says that Cronacle is a good
choice.
Hi Ashutosh,
Using a 3rd-party job scheduler has its own advantages and disadvantages,
from my experience in a previous engagement.
Advantages:-
Disadvantages:-
Regards,
R. Senthil Mareeswaran.
Hi,
I had used Maestro in a couple of projects.
The major pros and cons are covered in the replies above. One major advantage was
centralized monitoring across many SAP systems.
Now I do not advise any third-party tool, because with SAP Solution Manager
Enterprise edition and the RunSAP methodology, end-to-end solution operations are
taken care of, including job monitoring, business process monitoring, etc.,
across SAP and non-SAP system landscapes.
I need more illustration of using transaction code /SAPAPO/MC90 to release demand to
SNP.
Regards
Rami
Rami
There are two horizons in the product master: the SNP production horizon and the SNP stock
transfer horizon. If your company uses both SNP and PP/DS, then after the
forecast is released to SNP, you will execute SNP, and then for the short term you will
execute PP/DS.
In such a scenario (where SNP is used for mid-term planning and PP/DS for the short
term), you need to maintain the SNP production horizon (in case of in-house production)
and the stock transfer horizon (for external procurement / branch-to-branch transfers) to define
the boundary between SNP and PP/DS. That is, you need to specify up to what day/week
/month you need the PP/DS plan and for what time frame you need the SNP plan. The SNP
production / stock transfer horizon is the horizon during which SNP will not generate any plan.
That is, if your stock transfer horizon is, say, 10 days, then SNP will not generate any supply
within those 10 days even if there is demand; within those 10 days, supply will be generated
only by PP/DS. SNP will generate supply only outside the production / stock
transfer horizon.
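Saradha's horizon rule can be condensed into a tiny sketch (purely illustrative, not APO logic): demand falling inside the SNP production/stock transfer horizon is left to PP/DS, while demand outside it is planned by SNP.

```python
def planning_engine(days_out, snp_horizon_days=10):
    """Which engine creates supply for a demand 'days_out' days from today?
    Inside the SNP production/stock transfer horizon SNP stays silent and
    only PP/DS plans; outside it, SNP plans. (Simplified illustration.)"""
    return "PP/DS" if days_out <= snp_horizon_days else "SNP"

# With a 10-day horizon: a demand 5 days out is planned only by PP/DS,
# a demand 30 days out only by SNP.
```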
Thanks
Saradha
Hi
With transaction code /SAPAPO/MC90 you can release the
forecast from APO DP to SNP for planning. You can then execute the plan with the SNP
heuristic, CTM, or the optimizer.
The output of SNP planning can be used as a basis for making sourcing, deployment, and
transportation decisions, so this is rough-cut mid-term or long-term capacity planning. In
APO we can also do rough-cut capacity planning. The actual production is then
planned in PP/DS.
Amol
Dear,
Many people have explained this technically. Practically, see the example below.
Our forecast horizon is 120 days; i.e., when I plan in Jan 2011, the Feb, March, April, and May
forecast will be entered by sales in DP.
The data (the SNP horizon) will be transferred to SNP: 120 days of data as independent
requirements, and production and STO quantities are planned in SNP.
Then the Feb data (the production planning horizon) is transferred to PP/DS for further
processing, for finite scheduling to get the correct capacity evaluation and exact dates of
production start and end.
Hope this helps you understand the issue from a practical point of view.
Dear Raj,
There is no need for any integration model to transfer the forecast as PIRs to ECC.
There is a standard transaction code for this:
/SAPAPO/MC90 - Release Demand Planning to Supply Network Planning
The only important thing in this transfer is that for whatever CVCs you have in APO, the products
and locations need to exist in the R/3 system.
That's all.
Hi Vishal,
You need to create an integration model for the product-location combinations you want to
release the forecast for. You do this in R/3 with the CFM1 and CFM2 t-codes. Let me know if your
question is answered.
regards,
kiran
Forgot to tell you, Vishal: it's only through the CIF that you transfer demand from DP to R/3.
Dear Kiran,
You are making a mistake.
He is asking about transferring the forecast from APO to R/3, which cannot be handled through an
integration model.
There is a separate transaction code for this.
You can transfer PIRs from R/3 to APO with the help of an integration model.
pravin
Interestingly, as Kiran mentioned, you need a CIF integration model for the product-locations.
Though this is not mentioned, you might need to set this up to make a link
between APO and R/3 (I believe APO checks the integration model to validate the
location products for the transfer of PIRs).
Definitely APO will validate the product and location, as they should be master data in R/3.
Of course you cannot send a PIR for a material that does not exist in R/3 (though this
situation can arise, since only the CVC is master data in DP; DP does not check against R/3
whether such a relationship exists or not).
Regards,
Thanks all,
but does this mean the forecast can be transferred only through the SNP planning area?
And should I have the product master maintained for the same?
Hi Vishal ... please find below the process for releasing the forecast from APO to R/3 as PIRs.
<u><b>Pre-Requisites</b></u>:
• Product Masters are defined
• Location Masters are defined
• Distribution Types are defined.
(mySAP SCM - Implementation Guide -> Integration with SAP Components-> Integration of
SAP SCM and SAP R/3 -> Basic Settings for Data Transfer -> Publication -> Maintain
Distribution Definition or Generate and Delete Distribution Definition)
Please note:
Integration models -> used for transferring data from R/3 to APO
Publication / distribution -> used for transferring data from APO to any other logical system.
BTW, also check the APO and R/3 queues for any transactions that are stuck.
The InfoObject name of the characteristic that represents products in the planning area. (If you
leave this field blank, the system reads the data for the characteristics 9AMATNR and
9ALOCNO.)
I had faced a similar issue some time back. I remember setting it right by setting up a CIF model. I
suggest you try it out for one location-product (or whatever the equivalent characteristic in your DP
setup is) and see if it works. But if others have used it without a model, then maybe that works too.
Hello Vishal -
Prerequisites
· You have installed the correct Plug-In version.
· You have set up your products in SAP APO by choosing Navigation → Master Data
→ Product from the SAP APO menu tree.
· You have set up your locations in SAP APO by choosing Navigation → Master
Data → Location from the SAP APO menu tree.
· You have defined the split of product demand to locations in one of two possible
ways. For more information, see Location Split.
· If necessary, you have defined a product split in SAP APO. For more information,
see Product Split.
· You have created a data view for this task in SAP APO. The use of a separate
data view to transfer a demand plan to SAP R/3 has performance benefits. The data view
contains the following:
  ○ A future planning horizon only (no historical horizon, unless you want to transfer
the historical horizon too)
  ○ A planning buckets profile with one periodicity only (if it contains more than one
periodicity, the job is aborted with an error)
  ○ Only the key figure(s) that you want to transfer to SAP R/3
· You have set the planning strategy of each material in SAP R/3.
Regards,
Suresh Garg
Dear All,
I am getting very confused by this thread.
I had never worked on DP before; for the last 9 months I have been working on DP, and last month I
got APO certified.
As per my bookish knowledge and some practical experience, I don't think there is any need
to create the product or location in APO, and no need for the CIF model either.
If there is, I think I need to try everything in the system; I should not just believe the books.
Regards!!
Pravin Mukkawar
Learn a 15-step methodology for executing forecasting projects in SAP Advanced Planning and
Optimization. Understand the most common methods of statistical analysis. Learn best practices for
implementing these methods in practice.
Key Concept
Forecast strategies are used in SAP Advanced Planning and Optimization to decide how forecast values
are calculated. Selecting a method depends on characteristics of the data, including seasonality, linear
trends, and which data is emphasized.
Statistical forecasting is a strong feature of the Advanced Planning & Optimization (APO)
Demand Planning (DP) suite, and many companies look to this capability of APO for an
effective demand planning process. The most recent version of APO (SCM 7.0) covers a wide range
of statistical forecasting models. However, the mere availability of models does not ensure the best
forecast results unless the models are used effectively. The first few questions that probably come to
mind for any company looking at such a tool are:
What are the best practices for using the APO statistical forecasting tool?
How do I know which model best meets the needs of my business (as there are lots of models)?
Based on our experience of executing such statistical forecasting projects for clients from
different industries, we have put together a methodology for executing such projects. The
methodology is broken into 15 logical steps. We also provide a set of tips and tricks for effective
use of this tool and a set of case studies.
Step 1. Finalize the Scope of the Statistical Forecasting Project
In any statistical forecasting project, the common tendency is to do statistical forecasting for
every possible stock keeping unit (SKU) that the organization sells. However, it is important to
finalize the scope of the project for two reasons.
Statistical forecasting does not give the desired result in certain cases
Sometimes being selective gives quicker results
Forecasting does not give the desired result for certain SKUs, including these:
New SKUs for which very little history is available and which do not closely mimic the sales
behavior of existing SKUs (where like modeling cannot be used).
SKUs that the organization wants to discontinue in the next few months.
Purely promotional SKUs that are sold for a very short period during the year, such as Christmas.
Highly seasonal SKUs for which very little history is available. Ideally a statistical forecasting tool
needs at least 24 to 36 months of history of such SKUs to identify seasonality.
SKUs for which there is a permanent capacity constraint (i.e., the organization always sells less
than the original demand of the SKU as it has a constraint in production capacity).
SKUs with highly variable or unpredictable supplier lead time and production lead time.
Variability during replenishment skews the actual demand and makes forecasting unreliable.
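One simple way to check whether a SKU's history is long enough to reveal seasonality is to look at the autocorrelation at the seasonal lag. The sketch below is illustrative Python, not an APO feature; the function name and lag default are assumptions.

```python
def lag_autocorrelation(series, lag=12):
    """Autocorrelation at a seasonal lag: values well above zero at lag 12
    suggest yearly seasonality in monthly data. (Illustrative, not APO.)"""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean) for t in range(lag, n))
    return cov / var

# With 36 months of cleanly repeating data the lag-12 estimate is strong;
# with only 12-18 months it rests on a handful of overlapping points,
# which is why 24 to 36 months of history are recommended.
```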
From our experience, it is also important to be selective when starting such a project. A quick
ABC analysis of SKUs based on sales volume can be handy here. Identify those SKUs that
contribute 80 percent of sales and put most of the model-selection effort into these SKUs. If
better statistical forecasting can improve the forecast accuracy for these SKUs, it will have a
positive effect on the overall business and can deliver quicker results. While in the long run
forecasting needs to be extended to all SKUs, it is always better to start with the A and B category
SKUs.
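The quick ABC analysis described above can be sketched as follows. This is a minimal illustration; the 80/95 percent cut-offs and the function name are assumptions to adapt to your business.

```python
def abc_classes(sales_by_sku, a_cut=0.80, b_cut=0.95):
    """Rank SKUs by volume: 'A' items cover the first 80% of cumulative
    sales, 'B' the next 15%, 'C' the tail (cut-offs are assumptions)."""
    total = sum(sales_by_sku.values())
    ranked = sorted(sales_by_sku.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for sku, volume in ranked:
        cumulative += volume
        share = cumulative / total
        classes[sku] = "A" if share <= a_cut else ("B" if share <= b_cut else "C")
    return classes
```

Model-selection effort then goes into the A (and B) SKUs first, in line with the Pareto approach.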
As for the procedure, I have de-initialized the planning area, added the new key figure to the
planning area, and then to the planning books. My purpose is solved.
The problem arises when I transport the same to the Quality system. I am not able to see the new
key figure in the planning area in Quality. I expected that, even though the planning book is active in
TAQ, the key figure would be added when it comes with the transport.
Do you think my expectation is wrong? To be frank, when it comes to production we can't de-
initialize the planning area to add a key figure, as you know how much risk is involved in this process.
Is there any way we can add a key figure without de-initializing?
Thanks in advance
ARUN R Y
Hi Arun,
1. When you transport, you have to transport the key figure that you added to Quality first, and
then transport the planning area. You have to follow this sequence.
If you have already transported the planning area and did not transport the key figure, then send
another transport to the Q system with the key figure and then re-import your planning area
transport; this should solve your problem.
2. Effective SCM 5.0, to add a new key figure you need not deinitialize the planning area.
Go to /N/SAPAPO/MSDP_ADMIN, right-click on your planning area > Change Key Figure
Settings, and when prompted Yes or No, click Yes; there you can add/remove the key figures.
Don't forget to do a consistency check on the planning area after this, just to avoid any inconsistencies.
Hi Venkat,
1. I have followed the sequence: first I moved the key figure and then the planning area.
2. I am able to see the key figure in the InfoObject catalog on the right-hand side of the planning
area in the Q system.
3. And I am working on version 4.1; the suggestion you made I can't find in my version.
So, if I de-initialize and assign the key figure in the Dev system, is it required to repeat the same in
Quality also?
I expected that during the transport itself it would be assigned to the planning area in Quality...
Suggest me how I can move on this now. Is the only way to again de-initialize in Quality,
assign the same, and repeat the same in Production?
Thanks in advance
Arun R Y
Hi Arun,
Adding key figures without deinitialization of the PA is possible only from SCM 5.0. Since you are on 4.1,
you have to transport the key figure first and then the planning area. That is your only option.
Your QA system is usually closed for changes, so even if you deinitialize, you may not be able to
make changes to the planning area. It is also not advisable to do that.
Hi Arun,
This is very useful for me; after a long period I got this.
Babu
Hi Arun, after activating the planning area you can add an extra key figure; it is possible in version 7.0
without de-initialization of the PA.
I am not sure if you can transport a PA that is already active. I just tried it using v7.02 and I got errors
while transporting it. Once I deactivated the PA, the transport worked seamlessly.
Hi,
I want to add additional key figures to the active planning area in DP. We are using SCM 7.0, and
I am aware that we can do it by right-clicking on the planning area --> Change Key Figure Settings.
Please reply with the best option: whether to retransport the request or directly add the key
figures to the planning area.
Regards,
Dhanunjay.
Dhanunjay,
If you have KFs with data existing in Production, I would suggest two options:
1. If you do not want to take a backup of the PA, then do not use the transport option; add the KFs
directly in production.
2. If at all you want to use the transport path, then I would suggest you take a backup first and then
move forward.
Also, if you do not have time constraints, use this as a good opportunity to get started
with having a successful backup process in place, since any time in future you want to transport
PA-related changes, you will be presented with the same issues time and again. For future
enhancements too, this will prove a useful tool.
Hari Vomkarey
Hi Dhanunjay,
I am also trying the same. When I tried to transport the planning area to QA, the newly added
key figure was not transported into the QA system.
Please let me know whether you deactivated the planning area in QA and moved it
through a transport request.
Even when I deactivated the planning area in development, added the key figure, and sent it to
QA, it didn't work.
It looks like we have to deactivate the QA planning area before moving the planning area from
Dev to QA. For this approach we need to have a backup, and then we can proceed.
Now I have directly added the new key figure in QA and sent only the planning book with the
newly added key figure from DEV to QA.
It is working nicely in QA. But I have not yet added the new key figure in production;
as per the current trial, everything works fine in QA.
You have mentioned there is a difference between the DEV/QA and Production key figures. I
am not sure, but if you move the planning area through a transport request it will move only
the DEV/QA key figures, so I believe this impacts the key figures
which are not in production.
As in SCM 7.0 we have the option to add a key figure without deactivating the planning area,
it is better to directly create the key figure in production. Next week I am going to try this in
production. If the thread is open at that time, I will let you know.
Regards,
Saravanan V
Hi
If you are working with SAP SCM 5.0 or higher, you don't need to deactivate the planning area.
Directly add the new key figures by going to the planning area: Extras > Key Figure
Settings.
Regards
Dam
Hi Damodhar,
First of all, apologies for jumping into the middle of the thread with a question. It
seemed more appropriate here.
I am working on SCM 5.0. Would there be any reason for this 'Key Figure
Settings' option to be grayed out?
1. I have 3 different SNP PAs, and for one of them the 'Key Figure Settings'
option is active while for the other 2 PAs it is grayed out... nothing special here; all
three of them are copies of the same standard SNP PA.
2. Even for the PA for which it is active, when I try to add a KF with this option
without deinitializing the PA, it throws a strange error: "Planning object
structure ID Z9SNPFCS is invalid". But the POS connected to this PA is
9ASNPBAS... Any ideas why this Z9SNPFCS is coming up?
Hi
Could you please run the consistency check for the MPOS and try again?
Regards
Dam