
1. Which job properties option is used to enable the ability to run multiple copies of the same job?
A. Allow multiple instance
B. Allow multiple invocation
C. Enable information services
D. Enable runtime column propagation
2. Which environment variable will enable a job to run on a grid?
A. $APT_GRID_QUEUE
B. $APT_GRID_ENABLE
C. $APT_GRID_IDENTIFIER
D. $APT_GRID_COMPUTENODES

3. How is runtime column propagation enabled for a project?
A. Through the DataStage Director client
B. Through the Administrator client
C. Through the Output page mapping for Transformer stages
D. Set for individual input/output links via the Columns tab on most stages

4. Two new columns have been added to a table that is used in multiple DataStage jobs in your project. The new table definition has been imported into DataStage. How can you best determine which jobs might be affected by this change?
A. Select the old table definition and click the "find where used" option
B. Select the new table definition and click the "find where used" option
C. Select the new table definition and click the "find dependency" option
D. Click a table definition, click the "compare against" option, and select the new table definition
5. Which one of the following will prevent corruption when applying a DataStage map to an XML output document?
A. Set the stage map to NONE in each upstream stage
B. It cannot be applied
C. Set the map for the column that contains the XML input to NONE in the input stage and on the output link of the XML output stage
D. Set the SQL type for the column that contains the XML input to varchar in each upstream stage

6. When building the SELECT statement in the SQL Query Builder, which of the following is true?
A. You must manually enter which columns to fetch in the query
B. You drag tables from the repository tree to include them
C. You use the aggregation tab to set up aggregation, but grouping must be added manually
7. Using the Aggregator stage, what action is required within the job design to organize input records on a group of keys?
A. Sort the keys
B. Partition the keys
C. No action is required
D. Partition and sort the keys

8. The degree of parallelism can NOT be defined at which of the following levels?
A. Job
B. Link
C. Stage
D. Project

9. Which is the correct binding for exporting a DataStage job as a web service?
A. EJB
B. REST 2.0 services
C. Text over HTTP service
D. SOAP over HTTP

10. Consider a customer with multiple environments (development, test, and production). Which statement is true?
A. Jobs can be promoted using dsx import and export
B. Jobs must be compiled in the environment in which they will run
C. Jobs cannot be promoted using packages created with ISTOOL
D. Jobs cannot be promoted from one environment to another as source files
11. Which of the following statements is true for table definitions created with DataStage?
A. They are saved in the shared repository and can be made available to other IIS components
B. They are saved in the shared repository and are automatically available to other IIS components
C. They are saved in the shared repository and can never be made available to the other Information Server components
D. They are saved in the shared repository and are automatically available to other DataStage projects within the IIS server domain

12. Which stage returns processing data to a web service request?
A. ISD Input stage
B. ISD Output stage
C. ODBC Connector stage
D. WebSphere MQ Connector stage

13. You have a long-running parallel job that takes 8 hours to complete. Your task is to break the job into multiple smaller jobs, with data handed over from one job to the next. Using Data Sets to transfer the data between jobs has which of the following advantages? (Choose two)
A. The data sort order in each partition is preserved
B. The data partitions in the parallel framework are preserved
C. The data types used in the parallel framework are converted to SQL types
D. The data in each partition is persisted to a single table in a repository tier
E. The data in each partition is persisted to a separate table in a repository tier
14. What is the purpose of the $APT_IMPORT_HANDLE_SHORT environment variable?
A. There is no such environment variable
B. It is an environment variable that imports all fields with the correct special characters
C. It is an environment variable that imports all records that do not contain all the fields in the import schema
D. It is an environment variable that concatenates the files that match a file pattern before importing the files

15. Which of the following is true about submitting and restoring assets to a source code control system?
A. The deployment tool does not need to be installed on the client Windows machine
B. When sending the artifacts to the source control system, the deployment tool submits the artifacts to the source control with istool
C. When sending the assets to the SC workspace, the deployment tool writes the assets directly to the SC workspace
D. Restoring involves replacing the assets in the metadata repository; the deployment tool creates a file which you then manually import

16.
17.
18. Which of the following about the genkey.sh or genkey.bat command is true?
A. It is not available prior to installation of the IS suite
B. It can only be used to encrypt user IDs, not password entries
C. It can be used to create encrypted response file entries
D. It can be used to create encrypted credential file entries

19. Which of the following is true about importing table definitions for Amazon S3 files?
A. An Amazon S3 secret key is not required when importing metadata for files in Amazon S3 buckets
B. An OSH schema file describing the format of the data file must be in the same folder as the data file
C. When you are importing metadata for sequential files stored in Amazon S3 buckets, you can use InfoSphere Metadata Asset Manager to do the import
D. When you are importing metadata for sequential files stored in Amazon S3 buckets, you can use the sequential file definition process in DataStage Designer

20.
21. Which of the following is true about RCP?
A. RCP should never be used with the Sort stage
B. RCP should always be used with the Sort stage
C. The IIS DataStage Balanced Optimization tool does support RCP
D. The IIS DataStage Balanced Optimization tool does NOT support RCP

22. What best describes the functionalities of the Information Server DirectoryAdmin command? (Choose two)
A. Provide a method to encrypt user credentials
B. Administer the directory permissions of DataStage
C. Manage and monitor the active Information Server sessions
D. Assign the Suite Administrator role to a user
E. Reset the password of a user in the Information Server internal user registry

23. Which of the following is NOT true about the Oozie Workflow Activity stage?
A. IBM InfoSphere BigInsights must be installed
B. The stage invokes Oozie jobs from the command-line utility in Linux
C. Files must be extracted for the Oozie Client API to run the invoker
D. On non-BigInsights systems such as Cloudera, you must configure Oozie before you can configure the Oozie Workflow Activity stage

24. You can specify message handler use at different levels. Which of the following is a valid level at which to set a message handler?
A. Stage
B. Project
C. Record
D. Parameter

25.

26. While loading data into a table using the Netezza connector, the record ordering/erasing option will be enabled when which write mode is selected?
A. Insert
B. Delete
C. Update
D. User-defined SQL
27. In your parallel job design you have used two Sort stages and one Join stage. Assume the two Sort stages satisfy the input link requirements for configuration of the Join stage. Which of the following should be considered to improve sort performance in this job?
A. Replace the sort
B. Adjust the restrict memory usage option
C. Specify only the …
D. Set stable sort
E. Minimise the number …

28.
29. When you use Balanced Optimizer to optimize a job that writes to files on a Hadoop Distributed File System (HDFS) and you target a Big Data File stage, how does the design change in the optimized job?
A. The Big Data File stage in the optimized job is configured to access HDFS using Hive queries
B. The Big Data File stage in the optimized job is configured to access HDFS using JAQL queries
C. The Big Data File stage in the optimized job becomes a MapReduce stage that accesses HDFS using JAQL queries
D. The Big Data File stage in the optimized job becomes a File Connector stage that accesses HDFS using JAQL queries

30. Which statement is true about mapping data types when using NLS in parallel jobs?
A. Parallel and server jobs use the same locale mapping
B. Purely numeric data types never require mapping
C. Parallel jobs use two types of character sets, server jobs use one type of character set
D. Purely numeric data types and certain types of string data (character data) need mapping

31. Which of the following file system properties is NOT available in the File Connector stage?
A. Local
B. HDFS
C. HttpFS
D. WebHDFS

32. Which of the following is true about the creation of data rules with the Data Rules stage?
A. Data rules can be created within the Data Rules stage, but they cannot be published within the stage
B. Data rules can be both created and published within the Data Rules stage, but the rules can only be run within DataStage
C. Data rules can be both created and published within the Data Rules stage; after creation the rules can be run in both DataStage and Information Analyzer
D. Data rules cannot be created within the Data Rules stage; they must first be created in Information Analyzer before they can be used in the Data Rules stage
33. Which partitioning method is the default method chosen by the DataStage engine when processing data for the Join stage?
A. Auto
B. Hash
C. Same
D. Round robin
34. Which of the following are restructure stages? (Choose two)
A. Split vector
B. Make record
C. Column export
D. Promote record
E. Combine subrecord

35. Which of the following statements about shared container names is true?
A. They can contain the underscore character
B. They are limited to 31 characters
C. They may begin with a numeric character
D. They can contain alphanumeric characters

36. Which option is not available in a DB2 Connector stage to manage the order of data?
A. Process one record from each link
B. Process zero records from each link
C. Process records from the first link, then the second link, and so on
D. Process records from links based on a specified column sort order

37. Which tool would track your DataStage services load and ensure that newer jobs will not run on a busy system?
A. Director
B. Workload Manager
C. Operations Console
D. Performance monitor

38. What option would you use to reuse the sort key when using the Sort stage?
A. Sort order
B. Sort key mode
C. Create key change column
D. Create cluster key change column
39. You are reading a very large data set, hash partitioned on customer id, in a clustered environment. You need to join the data set with 1 GB of reference data on the same customer id. Which technique is best to use?
A. Use a Lookup stage and select auto partitioning on both input links
B. Use a Join stage, hash partition and sort both input links by the customer id column
C. Use a Lookup stage, select auto partitioning for the stream link and entire partitioning for the reference link
D. Use a Lookup stage, select same partitioning for the stream link and hash partitioning by customer id for the reference link

40.
41. The data going into the target Sequential File stage is sorted in each partition by the date field. Which collection algorithm should you use in the Sequential File stage to get the fastest performance?
A. Auto
B. Ordered
C. Round robin
D. Sorted merge
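To make the collection algorithms concrete, the sketch below is plain Python (not DataStage) showing what a sort-merge collector does: it interleaves partition streams that are each already sorted on the collecting key into one globally sorted stream, without re-sorting anything. The partition contents are invented for the example.

```python
import heapq

# Each partition is already sorted on the collecting key,
# mirroring the scenario in the question above.
partition_0 = [1, 4, 7]
partition_1 = [2, 5, 8]
partition_2 = [3, 6, 9]

# heapq.merge lazily interleaves the pre-sorted streams, always
# emitting the smallest head element next - a sort-merge collection.
collected = list(heapq.merge(partition_0, partition_1, partition_2))
# -> [1, 2, 3, 4, 5, 6, 7, 8, 9]
```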
42. New operators can be defined by deriving your own classes from the parallel job C++ class library. What are the operator classes you can derive from the C++ class library?
A. General, file, collector
B. Input, partitioner, collector
C. General, partitioner, collector
D. General, partitioner, error handling

43. Which environment variable provides the number of records consumed and produced by each stage?
A. $APT_DUMP_SCORE
B. $APT_RECORD_COUNT
C. $ENV_RECORD_COUNT
D. $OSH_RECORD_COUNT
44. Which statement is true about factors that affect the optimal degree of parallelism?
A. There is no upper limit to the benefits of parallelism
B. You must define two logical nodes for each physical node
C. Increasing parallelism does not add to the overhead of the job
D. You must weigh the gains of added parallelism against potential losses in processing efficiency

45. Which of the following is NOT a characteristic of the File Set stage?
A. Stores data in binary
B. Preserves partitioning
C. Represents persistent data
D. Linked to a particular configuration file

46. Which SCD statement is true?
A. There can only be one SCD stage per job
B. SCD stages support type 1, type 2, and type 3 processing
C. The SCD stage provides purpose codes to support dimension processing
D. The SCD stage is a processing stage that works within the context of a normalised database

47. In the schema format, which column types can be defined as variable length? (Choose two)
A. Raw
B. Date
C. String
D. Integer
E. Timestamp
48. During run time, which environment variable will report where the conductor and section leaders will run the individual players?
A. $OSH_DUMP
B. $APT_DUMP_SCORE
C. $APT_STARTUP_STATUS
D. $APT_PERFORMANCE_DATA

49. Which statement about sequence jobs is true?
A. Sequence jobs can be restarted
B. Parameters are not supported in sequence jobs
C. Property values of one activity cannot be passed to the next activity
D. Triggers are available as soon as you add an activity stage to the sequence

50. Which of the following cannot be handled by the Complex Flat File stage?
A. Two REDEFINES clauses
B. EBCDIC and ASCII data
C. Fixed OCCURS with no DEPENDING ON clause
D. The ability to select subsets of columns from the COBOL file description (CFD)

51. Which of the following is a valid option to run a DataStage job in sequential execution mode?
A. Add a BASIC Transformer stage into the job design
B. Remove the environment variable $APT_CONFIG_FILE from the job
C. Set the environment variable $APT_DISABLE_COMBINATION to true
D. Set the environment variable $APT_EXECUTION_MODE to one_process

52. Which of the following transformations cannot be implemented by the Modify stage?
A. Handle nulls
B. Concatenate strings from multiple fields
C. Modify a data field by adding an integer to it
D. Convert a string to all uppercase/lowercase

53. What is the location of transfer information for a Linux FTP job with job id 50 and checkpoint directory /home/user?
A. /home/user/ftp
B. /home/user/ftp_50
C. /home/user/pftp_50
D. /home/user/pftp_jobid_50

54. To manage the timeout between the section leader and the players, which of the following environment variables should be set?
A. $DSIPC_OPEN_TIMEOUT
B. $APT_PM_NODE_TIMEOUT
C. $DS_OSH_WRAPPER_TIMEOUT
D. $APT_PM_CONDUCTOR_TIMEOUT

55. Which connector stage doesn't have the enable partitioned reads property?
A. DB2 connector
B. DRS connector
C. Oracle connector
D. ODBC connector

56. What is the purpose of using named node pools?
A. When named node pools are used, DataStage uses named pipes between stages
B. Named node pools can constrain the execution of stages to a selected group of nodes
C. Named node pools designate the resources that can be used for a selected group of nodes
D. Named node pools designate the set of nodes that a job can use when it runs in a grid environment

57. Which of the following is true about Performance Analyzer?
A. Performance data is separated by partitions
B. It is invoked from within the Administrator client
C. Performance data is collected in real time as the jobs execute
D. Performance data for combined operators and inserted operators cannot be viewed

58. Which statement about sequence activity stages is true?
A. Loop conditions are set in End Loop stages
B. The Nested Condition stage can have multiple input triggers
C. The ExecCommand stage allows you to run external scripts
D. The Notification stage cannot support multiple addresses

59. Which of the following statements about the istool utility is true?
A. Authentication parameters are optional to the istool command
B. The istool command can only be run in command-line or script mode
C. In script mode, you do not need to include the istool string on each command line
D. For UNIX or Linux, the istool command framework is located in installation_directory/client/istool

60.
61. Which statement accurately describes the scope of DataStage job parameters?
A. Job parameters that are defined in the job are stored internally in DataStage and are accessible to other jobs that are in the same sequence job
B. Job parameters that are defined in the sequence job are stored internally in DataStage and are accessible only to the first job activity stage in the sequence job
C. Job parameters that are defined in the job are stored internally in DataStage and are accessible to all jobs that are currently running via a DataStage API call
D. Job parameters that are defined in the job are specific to the job, are stored internally in DataStage for the duration of the job, and are not accessible outside of that job
62. Your job copies data from several production tables to corresponding tables in a test environment. The Data Masking stage has been used to mask the production data. Which masking policy is used in the masking stage to preserve the referential integrity of the tables in the test environment?
A. Hash replacement
B. Hash lookup replacement
C. Random
D. Repeatable
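To illustrate why a deterministic policy preserves referential integrity, the sketch below is plain Python (not the DataStage Data Masking stage): because the same input always produces the same masked value, a key masked independently in two tables still matches, so joins between the masked tables keep working. The function name, salt, and prefix are invented for the example.

```python
import hashlib

def mask_account(account_number: str, secret: str = "demo-salt") -> str:
    """Deterministic (repeatable) mask: identical inputs always map
    to identical masked values, so foreign keys still line up."""
    digest = hashlib.sha256((secret + account_number).encode()).hexdigest()
    return "ACCT-" + digest[:8]

# The same account number masked in two different tables still matches.
orders_key = mask_account("4711")
customers_key = mask_account("4711")
assert orders_key == customers_key  # a join between masked tables still works
```

A purely random policy would break this property, since each table would receive unrelated masked values for the same original key.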
63. A job validates account numbers against a reference file using a Join stage, which is hash partitioned by account number. Assuming adequate hardware resources, which action can be used to improve the performance of the job?
A. Replace the join with a Merge stage
B. Increase the number of nodes in the configuration file
C. Add a Sort stage in front of the Join stage and sort by account number
D. Use round robin partitioning on the stream and entire partitioning on the reference

64. The find/replace facility within the Transformer stage can NOT do which of the following?
A. Find and replace an expression
B. Find/replace a column name
C. Find/replace a stage variable
D. Find the next expression that contains an error

65. Which join operation keeps records from the input data sets whose key columns contain equal values and drops records whose key columns do not contain equal values?
A. Left outer
B. Inner
C. Full outer
D. Right outer
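The behaviour described in the question above can be sketched in plain Python (not the DataStage Join stage): an inner join keeps only records whose key values appear in both inputs and drops everything else. The function and field names are illustrative only.

```python
def inner_join(left, right, key):
    """Keep only records whose key value exists in both inputs."""
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    joined = []
    for row in left:
        for match in index.get(row[key], []):  # unmatched keys are dropped
            joined.append({**row, **match})
    return joined

left = [{"id": 1, "name": "ann"}, {"id": 2, "name": "bob"}]
right = [{"id": 2, "city": "rome"}, {"id": 3, "city": "oslo"}]
result = inner_join(left, right, "id")
# -> [{"id": 2, "name": "bob", "city": "rome"}]
# id 1 (left only) and id 3 (right only) are both dropped.
```

A left or right outer join would instead retain the unmatched rows from one side, and a full outer join from both sides.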
