
How to check &PH&

How many times have you heard, "Is there anything in &PH&?" That many.

The following is everything you ever wanted to know about &PH&, and then some.

When a DataStage job runs, it creates one or more phantom processes on your DataStage server: generally, one phantom process for the job, and one for each active stage within the job. "Phantom" is simply the DataStage terminology for "background process". Each phantom process has its own log file that records information about the process's execution; this information can be useful for debugging problems.

Log files are created in the folder &PH&. A &PH& folder exists in each DataStage project folder. A DataStage job phantom creates a log file in the &PH& folder named with the prefix DSD.RUN_, and a DataStage active stage creates a log file named with the prefix DSD.StageRun_. All log files end with a time and date suffix: the time is seconds since midnight, and the date is a UniVerse internal (Julian) date. These dates and times are usually close to those found in DataStage Director on the control event "Starting job ...".

A useful tool is a routine created in DataStage Manager that suggests a log file name. The source of this routine is:

   Ans = "DSD.RUN_":Iconv(TheTime,"MTS"):"_":Iconv(TheDate,"D-YMD[4,2,2]")

TheDate and TheTime are the routine arguments. Get the job's run date and time from DataStage Director, then use the Test button for this routine in DataStage Manager to compute a suggested log file name. (A shell version of the same calculation is sketched below, after the NT/SAMBA steps.)

Another useful piece of information is the DataStage job's number. An easy way to find a job number is with DataStage Administrator: select a project on the Projects tab, press the Command button, and enter the command:

   LIST DS_JOBS JOBNO WITH NAME = "your job name"

Press the Execute button, and your job's name and number will be displayed.

Armed with a job's name, run date, run time, and a suggested log file name, finding the actual log file depends on the operating system of your DataStage server. The following steps assume a workstation with capabilities similar to Windows NT and My Computer or Explorer.

For NT servers, and UNIX servers with SAMBA:

1. Map the folder containing the DataStage project to a drive on your workstation.
2. Open the &PH& folder in the project folder.
3. Change the view to arrange the items in the folder by date.
4. Do a little pointing and clicking until you find the specific log file. It should begin with the line:

   DataStage Job <your job number> Phantom <a phantom number>
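As a command-line alternative to the Manager routine above, the sketch below computes the same suggested name in a shell on the server. Treat it as a sketch only: it assumes GNU date is available and that the UniVerse internal date counts days from 31 December 1967 (day 0), and the script name and arguments are illustrative.

   #!/bin/sh
   # Sketch: suggest a &PH& log file name for a given run date and time,
   # mirroring the DataStage Manager routine above (internal time, then internal date).
   # Usage: suggest_ph_name.sh YYYY-MM-DD HH:MM:SS   (e.g. 2004-06-15 14:05:30)
   # Assumes GNU date; the epoch of day 0 = 31 December 1967 is an assumption.
   RUN_DATE="$1"
   RUN_TIME="$2"

   # Internal time: seconds since midnight on the run date.
   internal_time=$(( $(date -d "$RUN_DATE $RUN_TIME" +%s) - $(date -d "$RUN_DATE 00:00:00" +%s) ))

   # Internal date: whole days between the assumed UniVerse epoch and the run date.
   internal_date=$(( ( $(date -u -d "$RUN_DATE" +%s) - $(date -u -d "1967-12-31" +%s) ) / 86400 ))

   echo "DSD.RUN_${internal_time}_${internal_date}"

Because the suffix only needs to get you close to the right file (you still sort by date and inspect the candidates), small differences from the exact name do not matter much.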

For other UNIX servers:

1. Telnet to your DataStage server.
2. Change directory to the &PH& folder in the project folder:

   cd \&PH\&

3. Using the find and grep commands, locate the job's log files (replace the pound sign with the job number):

   find . -type f -exec grep -l "DataStage Job #" {} \;

4. Use the view command to review each file (to exit view, enter :q):

   view <path name from the previous step>

In each project there is a &PH& directory. The phantom processes write entries to it with names of this form:

   DSD.RUN_InternalDate_InternalTime
   DSD.STAGERUN_InternalDate_InternalTime

This directory can become large and affect performance. There is no exact number of files that causes a problem, because of variances in computing power, but the directory should be cleaned as part of regular maintenance; the more jobs you run, the quicker it grows. You can check how many files exist with the command:

   ls | wc -l

There are a couple of ways to fix this problem:

Log into Administrator --> Projects --> Command and type:

   CLEAR.FILE &PH&

This command can be run with users logged in and jobs running, since it looks for existing locks and will skip those files.

From $DSHOME:

1. Source the dsenv file: . ./dsenv
2. Type: ./bin/uvsh
3. Type: LOGTO <ProjectName>
4. Type: CLEAR.FILE &PH&

You can also create a shell script to manually delete the files. To ensure there are no locks, only delete files that belong to finished jobs: make sure the files are older than the longest-running job. Generally you can just delete files older than a week. A sketch of such a script follows.
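For the shell-script option, here is a minimal sketch. It assumes the project directory is passed as the first argument; the script name, the DSD.* pattern, and the seven-day threshold are illustrative, so set the threshold to comfortably exceed your longest-running job.

   #!/bin/sh
   # Sketch: remove &PH& log files older than a week, on the assumption that
   # no job has been running that long (adjust -mtime to suit your site).
   # Usage: clean_ph.sh /path/to/Projects/MyProject
   PROJECT_DIR="$1"

   cd "$PROJECT_DIR/&PH&" || exit 1

   # Delete only phantom log files (DSD.*) last modified more than 7 days ago.
   find . -type f -name 'DSD.*' -mtime +7 -exec rm -f {} \;

Restricting the match to DSD.* keeps the script from touching anything else that may live in the directory.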
