
In DataStage, how does one specify where the database script and database log go in the Teradata Connector?

Technote (troubleshooting)

Problem (Abstract)
In the Teradata Multiload stage, there is a way to define a report file and a control file. The Teradata Connector stage, however, has no field for this. Is there a way to do the same thing as in the Teradata Multiload stage?

Resolving the problem
The Teradata Connector stage does not have the functionality to produce data files and report files; TPT is designed differently from the Multiload stage. However, there is a way to generate TPT trace files. Enable the following environment variables:

a) CC_MSG_LEVEL=2
b) CC_TERA_DEBUG=4

This generates files named TD_TRACE_OUTPUT.<DBNAME>.<TABLENAME>.<#>.txt in the ../Server/Projects/<ProjectName> folder. On UNIX systems, these files may instead be generated in the user's home directory (/home/dsadm for the dsadm user). The trace file contains information about the flags enabled on TPT, the error table, the log table, and the insert flags.

TPT does not provide the data files for the following reasons:

a) With TPT, it is not possible to load a script the way it is done with MLOAD.
b) The interface/application that communicates with the TPT interface must have thorough knowledge of the TD_XXX flags.
c) The data format (big-endian or little-endian) must also be handled, along with the size of the data file at the save location.
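As a sketch, the two variables can be exported in the engine environment (for example in dsenv) before the job runs; defining them as project- or job-level environment variables in DataStage Administrator is the more common approach, so treat this shell form as one option, not the only one:

```shell
# Enable TPT tracing for the Teradata Connector stage. Exporting in the
# engine shell environment (e.g. dsenv) is a sketch of one option;
# project- or job-level definitions in DataStage are more typical.
export CC_MSG_LEVEL=2    # connector message level (2 = debug)
export CC_TERA_DEBUG=4   # Teradata connector debug flag

echo "CC_MSG_LEVEL=$CC_MSG_LEVEL CC_TERA_DEBUG=$CC_TERA_DEBUG"
```

After the next job run, look for the TD_TRACE_OUTPUT.<DBNAME>.<TABLENAME>.<#>.txt files in the project folder, or in the user's home directory on UNIX.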

InfoSphere DataStage Parallel job hangs when updating a table using a multi-node configuration

Technote (troubleshooting)

Problem (Abstract)
An InfoSphere DataStage Parallel job hangs when updating a table with a multi-node configuration, but runs successfully on a single node.

Cause
Processes that update tables concurrently, such as a parallel job running on multiple nodes, may run into locking issues when more than one process tries to update the same row. If a lock issue occurs, you will not see any error messages in the log; the job will simply appear to hang.

Resolving the problem
To prevent this, the job design must ensure that all rows with the same key belong to the same partition. The only partition method that guarantees this kind of distribution is Hash. The Hash method distributes the input data based on the values of the key columns, so rows with the same key columns always end up in the same partition. To learn more about Hash partitioning, refer to the Parallel Job Developer Guide. To make sure you are not running into a lock issue, verify that the stages that update rows in the job use the Hash partition method.

After SQL statement executes when job fails in IBM InfoSphere DataStage

Technote (troubleshooting)

Problem (Abstract)
In an Oracle Connector stage and an ODBC Connector stage, the After SQL statement executes even if there is a failure and the job aborts.

Resolving the problem
This is intended behavior. The behavior of the Oracle Connector, ODBC Connector, ORAOCI plugin, and DRS plugin stages is as follows:

Oracle Connector: The "After SQL (node)" statement also runs regardless of a fatal error on that node.
ODBC Connector: Behaves exactly the same as the Oracle Connector.
ORAOCI plugin: By default, if warnings are logged in the job, the "After SQL" statement still executes. If you set the property to treat warnings as errors, the job aborts at the first failure and "After SQL" is not executed. There is no "After SQL (node)".
DRS plugin: Behaves exactly the same as the ORAOCI plugin.

IBM InfoSphere DataStage dsjob command returns Status code = 80011

Technote (troubleshooting)

Problem (Abstract)
Running the dsjob utility, with or without the -domain option, results in the error message: Status code = 80011.

Cause
The dsjob utility is unable to authenticate and validate against the DataStage Engine.

Resolving the problem
Use the -domain NONE option.

For example, instead of:

<DS_HOME>/bin/dsjob -domain server_name:port -user xxxx -password xxxx -server server_name -lprojects

use:

<DS_HOME>/bin/dsjob -domain NONE -user xxxx -password xxxx -server server_name -lprojects

Also see the related information below for online references for the dsjob utility.

In DataStage Designer, the Projects drop-down is empty

Technote (troubleshooting)

Problem (Abstract)
When you connect with DataStage Designer, enter a valid username and password, and then open the Projects drop-down, no projects are listed.

Resolving the problem
This usually indicates that the ASBAgent is not running or has lost its connection. Stop and restart the ASBAgent and test again. If possible, stop and start all tiers to ensure all tiers are connected.
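The ASBAgent restart step above can be sketched as follows. The install path is the common Information Server default and is an assumption here, so adjust it for your site; the sketch prints the commands rather than executing them, since it may be run on a machine without Information Server installed:

```shell
# Restart sequence for the ASBAgent (node agent). The path below is the
# typical default Information Server location -- an assumption, adjust
# for your install.
ASB_HOME="${ASB_HOME:-/opt/IBM/InformationServer/ASBNode}"

# Print rather than execute, so the sketch is safe to run anywhere;
# remove the echo to actually perform the restart.
for action in stop start; do
  echo "$ASB_HOME/bin/NodeAgents.sh $action"
done
```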
