
Date: ____/____/_______

Workshop Training [Hadoop MCQ]


Candidate Name__________________________________ Batch _____ Location ___________
Q1)  CDH is a __________ type of Hadoop Distribution.
a)Standalone b)Pseudo Distribution c)Full Distribution d)None of the mentioned

Q2)  Hadoop MapReduce Jobs can be written in Java and _____________.


a)C b)C# c)Python d)Ruby

Q3) __________ maps input key/value pairs to a set of intermediate key/value pairs.


a)Mapper b)Reducer c)Both Mapper & Reducer d)None of the mentioned

Q4) A ________ serves as the master and there is only one per cluster.
a)Data Node b)Name Node c)Data Block d)Replication

Q5) The ________ NameNode is used when the Primary NameNode goes down.
a)Rack b)Data c)Secondary d)None of the mentioned

Q6) Point out the wrong statement:
a)Replication Factor can be configured at a cluster level (default is set to 3)
b)Each Data Node sends a heartbeat/Block Report to the Name Node indicating that it is alive
c)User data is stored on the local file system of DataNodes
d)DataNode is aware of the files to which the blocks stored on it belong

Q7) ________ is the architectural center of Hadoop 2 that allows multiple data processing engines.
a)YARN b)Hive c)Pig d)Job Tracker

Q8) The __________ is a framework-specific entity that negotiates resources from the ResourceManager.
a)NodeManager b)ResourceManager c)ApplicationMaster d)All of the mentioned

Q9) Apache Hadoop YARN stands for:


a)Yet Another Reserve Negotiator b)Yet Another Resource Network c)Yet Another Resource Negotiator d)None of the mentioned

Q10)What is the default data block size in Hadoop 2.x?


a)128MB b)64MB c)100MB d)None of the mentioned

Q11) In Hadoop 1.x, Map & Reduce tasks are managed by _____________.


a)Application Master b)Task Tracker c)Resource Manager d)Job Tracker

Q12) How many blocks will be created for a file of size 200MB in Hadoop 1.x?
a)2 b)3 c)4 d)5

Q13) The _________ tool can be used to import or export data between a traditional RDBMS and Hadoop.
a)scoop b)sqoop c)hive d)pig

Q14) Web log or social media “streaming” data can be analysed using the ________ tool.
a)Hive b)Pig c)Flume d)None of the mentioned

Q15) __________ is an Apache Hadoop UI and a web application for querying and visualizing data by interacting with Apache Hadoop.
a)MapReduce b)Flume c)Hue d)None of the mentioned

mtaeducation.in
