
Interview Questions for SAP BW and SAP BI
By: leela naveen | 05 Jan 2011 8:58 am

These are questions I faced in interviews. If you have screenshots for any of these questions, please share them as well.

1. We have standard InfoObjects delivered by SAP; why did you create Z InfoObjects? Can you tell me the business scenario?
2. We have standard InfoCubes delivered by SAP; why did you create Z InfoCubes? Can you tell me the business scenario?
3. For a key figure, what is meant by cumulative value, non-cumulative value with value change, and non-cumulative value with inflow and outflow?
4. When you create an InfoObject it offers "reference" and "template". What are they?
5. What is meant by a compounding attribute? Tell me the scenario.
6. I have 3 cubes, I created a MultiProvider on them and built a report on it, but I did not get any data in the report. What happened?
7. I created a MultiProvider on 10 cubes but I want data from only 1 cube. What do you do?
8. What is meant by the safety upper limit and safety lower limit in the deltas? Explain them one by one for time stamp, calendar day and numeric pointer.
9. I have 80 queries; how do you identify which query is taking so much time, and how can you solve it?
10. During compression all request IDs become zero. Which data is being compressed? Explain in detail.
11. What is meant by a flat aggregate? Explain in detail.
12. I created a process chain; on the first day it took 10 minutes, after the first week it took 1 hour, and later it took a whole day with the same loads. What happened, and how can you reduce the loading time?
13. How can you find out the size of a cube? Explain in detail (screenshots if you have them).
14. Where can we find the transport return codes?
15. I have a report that takes too much time. How can I rectify it?
16. What is an offset? Can we create queries without offsets?
17. I said I have nearly 600 process chains; he asked how I monitor them. I said I check RSPCM and BWCCMS, and he asked whether there is any third-party tool to monitor them. If such tools exist, which are they?
18. How do clients access the reports?
19. I don't have master data; is it still possible to load transaction data? If it is possible, what additional steps are needed?
20. What is a structure in reporting?

21. Based on which object did you create the extended star schema?
22. What is a line item dimension? Explain briefly.
23. What is high cardinality? Explain briefly.
24. A process chain is running and I have to stop the process for 1 hour, then rerun it from where it stopped. How? Also: in a MultiProvider, can I use aggregations?
25. What is direct scheduling and what is a meta chain?
26. Which patch are you using at present? How can I find out which patch it is?
27. How can we increase the data packet size?
28. Hierarchies are not there in BI. Why?
29. Is remodeling applied only to InfoCubes? Why not to DSO/ODS?
30. With jump queries, can we jump to any transaction such as RSA1, SM37 etc.? Is it possible or not?
31. Why does ODS activation fail? What types of failures are there, and what are the steps to handle them?
32. A process chain is running and the InfoPackage gets an error. Without processing that InfoPackage's error, can you still run the dependent variants? Is it possible?

Please also share any performance, loading or support issues you have faced (reporting errors, loading errors, process chain errors), and explain them in detail, with screenshots where possible. If you have any screenshots or text material, please share the link.

Comments

1. Standard InfoObjects are SAP-defined; InfoObjects starting with Z are user-defined, and we create Z InfoObjects based on the business requirement.
2. In the same way, user-defined cubes start with Z.
3. A cumulative value can be meaningfully summed up over time, while a non-cumulative value is a snapshot that changes with every movement and is derived either from a single value-change key figure or from separate inflow and outflow key figures. For example, the stock value in a warehouse differs from the start of the day to the end of the day, because there is a lot of inflow of material and outflow of finished goods (a small worked sketch follows this comment).
4. When we create an InfoObject with a reference, we refer to another characteristic InfoObject, and any changes made to that InfoObject are reflected here as well. With a template we also copy from another characteristic InfoObject, but later changes to that InfoObject are not taken over.
By: Murali | 21 Jan 2011
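The inflow/outflow idea in answer 3 can be shown with a few lines of arithmetic. This is a minimal illustrative sketch in Python, not SAP code; the stock figures and movements are made-up example values.

```python
# Illustrative sketch of a non-cumulative key figure such as total stock:
# the snapshot value is reconstructed from an opening value plus the
# inflow (+) and outflow (-) movements posted during the period.
opening_stock = 100                # stock at the start of the day (example)
movements = [+40, -25, +10]        # receipts (+) and issues (-) during the day

closing_stock = opening_stock + sum(movements)
print(closing_stock)               # 125: the snapshot a query would show at end of day
```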

27. The data packet size can be maintained in the DTP settings, on the Extraction tab.
By: Syed | 15 Feb 2011

23. When the size of a dimension table exceeds about 20% of the fact table size, we treat it as a high cardinality dimension (see the sizing sketch after these comments).
By: sarat | 05 Apr 2011

22. When your dimension table becomes large (more than about 10% of the fact table), you can make the dimension a "line item" dimension; only one characteristic can be assigned to a line item dimension. By making the dimension a line item dimension, the SID values are used directly and no dimension ID table is created, which means fewer joins and better query performance (see the sizing sketch after these comments).
By: sarat | 05 Apr 2011

21. The extended star schema is created based on the strong entity: the master data (SID and attribute tables) sits outside the cube and is linked to the dimension tables through SIDs.
By: narendra | 07 Apr 2011

5. Compounding defines a superior InfoObject that must be combined with the InfoObject to define it uniquely. For example, when you define a cost center, the controlling area is the compounding (superior) object, because the same cost center can exist in more than one controlling area.
By: sarat | 07 Apr 2011

24. To stop the process chain for one hour and rerun it from where it stopped, follow these steps:
1. Go to transaction SM37.
2. Enter the job name, which is suffixed with the day of the week based on the system date and time (GMT), e.g. BW SCHED START-DAILY FRIDAY, and execute.
3. Select the job and go to the menu path Job => Change.
4. Choose the start condition.
5. Enter the required time to postpone the process and save it.
By: sarat | 19 Apr 2011
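The 10% and 20% figures in answers 22 and 23 are rules of thumb comparing the dimension table to the fact table. Below is a minimal sketch of that ratio check, assuming made-up row counts; it is illustrative Python, not something SAP executes (in a real system you would look at the sizes of the /BIC/D* dimension and /BIC/F* fact tables).

```python
# Rule-of-thumb check behind the line item (~10%) and high cardinality (~20%)
# thresholds quoted in the comments above. Row counts are example values.
fact_rows = 1_000_000   # rows in the fact table
dim_rows = 250_000      # rows in one dimension table

ratio = dim_rows / fact_rows
if ratio > 0.20:
    print(f"{ratio:.0%} of the fact table: flag the dimension as high cardinality")
elif ratio > 0.10:
    print(f"{ratio:.0%} of the fact table: consider a line item dimension")
else:
    print(f"{ratio:.0%} of the fact table: a normal dimension is fine")
```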

31. ODS Activation Failed

31.1 Why does the error occur?
During a data load into an ODS it can happen that the data is extracted and loaded completely, but the ODS activation then fails with a status 9 error. It can also be caused by a lack of resources or by an existing failed request in the ODS (for master data an existing failed request is not a problem). It happens when there are rollback segment errors in the Oracle database, giving error ORA-00060: during activation the data is processed into the active data table, where records are either inserted or updated, and while doing this there are system deadlocks and Oracle is unable to extend the extents.

31.2 What happens when this error occurs?
The exact error message looks like: "Request REQU_3ZGI6LEA5MSAHIROA4QUTCOP8, data package 000012 incorrect with status 9 in RSODSACTREQ". Sometimes it is accompanied by a "Communication error (RFC call) occurred" message, which is actually a system error. The message appears in the job overview in RSMO, or in the Display Messages option of the process in the process chain, and the summary message is "ODS Activation Failed".

31.3 What are the possible actions to be carried out?
When such an error occurs, the data may or may not be completely loaded; it is only the activation that fails. In the details of the job we can see which data package failed during activation. We can then try to activate the ODS manually. Do not change the QM status: in the monitor it is green while within the data target it is red, and it only turns green once the data is activated. For a successful activation of the failed request, click the Activate button at the bottom, which opens another window containing only the requests that are not yet activated. Select the request, check the corresponding options at the bottom, and click Start.

This sets up a background job for the activation of the selected request. Monitor it until it completes successfully, and then finish any further loads in the process chain.

If the above does not work, check the size of the data package specified in the InfoPackage (InfoPackage -> Scheduler -> DataS. Default Data Transfer). There we can reduce the maximum size of the data package so that activation goes through; once the size has been reduced, we retrigger the load and reload the complete data.

Before starting the manual activation it is very important to check whether there is an existing failed (red) request; if so, make sure you delete it before starting the manual activation. The error occurs in the first place because at that point in time the system is not able to run the activation via the 4 parallel processes (this parameter is set in transaction RSCUSTA2); later, when the resources are free, the activation completes successfully.
By: sarat | 19 Apr 2011

8. Safety intervals are used in a generic DataSource to make sure that no data records are missed, even if they were not yet stored in the database table when the extraction took place. A pointer (time stamp, calendar day or numeric value) is recorded during extraction and used for the next delta extraction (a small sketch of how the limits shift the selection window follows this comment).
1) 0CALDAY: if we set up the delta based on calendar day, we can run the delta only once per day, and only at the end of the day, to minimize missing records.
2) Numeric pointer: this type of delta is suitable only when we are extracting data from a table that supports only the creation of new records, not changes to existing records, e.g. CATSDB (the HR time management table).

3) Time stamp: using a time stamp we can run the delta multiple times per day, but we need to set a safety lower limit and safety upper limit of at least 5 minutes (300 seconds).
By: sarat | 19 Apr 2011
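A minimal sketch of how the safety limits in answer 8 shift the selection window of a time-stamp delta. The timestamps and the 300-second values are example figures, and this is plain illustrative Python, not SAP code.

```python
# Illustrative sketch: selection window of a time-stamp based generic delta.
# The lower limit re-reads a little already-extracted data (overlap), and the
# upper limit leaves the newest records for the next run, so late-arriving
# records are not missed.
from datetime import datetime, timedelta

last_pointer = datetime(2011, 4, 19, 10, 0, 0)   # upper limit of the previous delta run
now = datetime(2011, 4, 19, 12, 0, 0)            # time of the current extraction
safety_lower = timedelta(seconds=300)            # safety interval lower limit
safety_upper = timedelta(seconds=300)            # safety interval upper limit

select_from = last_pointer - safety_lower
select_to = now - safety_upper
print(f"delta selects records changed between {select_from} and {select_to}")
```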
