Hadoop MCQ

  • Sharad Jaiswal
  • 22nd Nov, 2021

We have listed here the best Hadoop MCQ questions to test your basic knowledge of Hadoop. This Hadoop MCQ test contains 35+ Hadoop multiple-choice questions; you have to select the right answer to every question. The quiz covers the important topics of Hadoop, so you can perform your best in Hadoop MCQ exams, interviews, and placement drives.

Practice the best Hadoop MCQ questions below.

1) The Hadoop framework is written in

  • A. C++
  • B. Python
  • C. Java
  • D. Go

2) What is the full form of HDFS?

  • A. Highly distributed file shell
  • B. Hadoop directed file system
  • C. Hadoop distributed file system
  • D. Highly distributed file system

3) Which of the following technologies is a document-store database?

  • A. Cassandra
  • B. CouchDB
  • C. HBase
  • D. Hive

4) When a file in HDFS is deleted by a user:

  • A. It is lost forever
  • B. It goes to trash if configured
  • C. Files in HDFS cannot be deleted
  • D. It becomes hidden from the user but stays in the file system

5) Which of the following deals with the small-files issue?

  • A. Hadoop archives
  • B. HBase
  • C. Sequence files
  • D. All of the above

6) Which of the following features overcomes the NameNode single point of failure?

  • A. HDFS federation
  • B. Erasure coding
  • C. High availability
  • D. None of these

7) Which of the following is an advantage of the 3x replication scheme in Hadoop?

  • A. Fault tolerance
  • B. Reliability
  • C. High availability
  • D. All of the above

8) What is the default size of the distributed cache?

  • A. 10 GB
  • B. 15 GB
  • C. 13 GB
  • D. 11 GB

9) The InputFormat class calls the ________ function, computes splits for each file, and then sends them to the JobTracker.

  • A. getSplits
  • B. gets
  • C. puts
  • D. None of these

10) Which parameter describes the destination directory that will contain the archive?

  • A. archiveName
  • B. destination
  • C. source
  • D. None of these
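For context, the Hadoop archive tool that these parameters belong to is invoked from the command line roughly as follows (a sketch, assuming a running Hadoop cluster; the paths and archive name are illustrative):

```
# Pack the small files under /user/demo/input into one archive.
# -archiveName names the .har file, -p sets the parent path of the
# sources, and the final argument is the destination directory.
hadoop archive -archiveName files.har -p /user/demo input /user/demo/archived

# The archive can then be browsed through the har:// filesystem.
hdfs dfs -ls har:///user/demo/archived/files.har
```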

11) What is the approximate upper limit on counters for a MapReduce job?

  • A. ~50
  • B. ~150
  • C. ~5
  • D. ~15

12) Identify the incorrect statement:

  • A. All the znodes are prefixed using the default /hbase location
  • B. The znodes that you’ll most often see are the ones that coordinate operations like Region Assignment
  • C. ZooKeeper provides an interactive shell that allows you to explore the ZooKeeper state
  • D. All of the above

13) Hadoop is open source.

  • A. True only for Apache Hadoop
  • B. True only for Apache and Cloudera Hadoop
  • C. Always True
  • D. Always False

14) Which one of the following stores data?

  • A. Master node
  • B. NameNode
  • C. DataNode
  • D. None of these

15) What is meant by the data locality feature in Hadoop?

  • A. Co-locate the data with the computing nodes
  • B. Store the same data across multiple nodes
  • C. Distribute the data across multiple nodes
  • D. Relocate the data from one node to another

16) Which of the following is not a scheduler option available with YARN?

  • A. Fair scheduler
  • B. Capacity scheduler
  • C. Optimal scheduler
  • D. FIFO scheduler

17) The JobTracker runs on

  • A. NameNode
  • B. Secondary DataNode
  • C. DataNode
  • D. Secondary NameNode

18) What is the role of the journal node?

  • A. Report the edit log information of the blocks in the DataNode
  • B. Report the location of the blocks in a DataNode
  • C. Report the schedules when the jobs are going to run
  • D. Report the activity of various components handled by the resource manager

19) What is Hive used as?

  • A. MapReduce wrapper
  • B. Hadoop query engine
  • C. Hadoop SQL interface
  • D. All of the above

20) Choose the core component of Hadoop:

  • A. MapReduce
  • B. HDFS
  • C. Both A and B
  • D. None of these

21) Which of the following is a critical feature of big data?

  • A. Volume
  • B. Velocity
  • C. Variety
  • D. All of the above

22) Facebook tackles big data with _______, based on Hadoop.

  • A. Project Bid
  • B. Project Prism
  • C. Prism
  • D. Project Data

23) Under which of the following licenses is Hadoop distributed?

  • A. Middleware
  • B. Apache License 2.0
  • C. Shareware
  • D. Mozilla

24) Which of the following genres does Hadoop produce?

  • A. Distributed file system
  • B. JSP
  • C. JAX-RS
  • D. Java Message Service

25) Hive also supports custom extensions written in ___________.

  • A. C
  • B. Java
  • C. C#
  • D. C++

26) What was Hadoop written in?

  • A. Perl
  • B. Java (software platform)
  • C. Lua (programming language)
  • D. Java (programming language)

27) Which of the following is used for machine learning on Hadoop?

  • A. Pig
  • B. Hive
  • C. HBase
  • D. Mahout

28) What are the advantages of the 3x replication scheme in Hadoop?

  • A. Reliability
  • B. Fault tolerance
  • C. High availability
  • D. All of the above

29) Which of the following is true about Hadoop high availability?

  • A. The Hadoop High Availability feature supports only a single NameNode within a Hadoop cluster.
  • B. The Hadoop High Availability feature tackles the NameNode failure problem for all the components in the Hadoop stack.
  • C. The Hadoop High Availability feature tackles the NameNode failure problem only for the MapReduce component in the Hadoop stack.
  • D. All of the above

30) Which configuration file is used to control the HDFS replication factor?

  • A. core-site.xml
  • B. yarn-site.xml
  • C. hdfs-site.xml
  • D. mapred-site.xml
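For reference, the replication factor is controlled by the `dfs.replication` property in hdfs-site.xml, and it can be inspected or changed from the command line (a sketch, assuming a configured HDFS client and an illustrative file path):

```
# Print the effective replication factor from the configuration.
hdfs getconf -confKey dfs.replication

# Change the replication factor of an existing file to 2
# (-w waits until re-replication completes).
hdfs dfs -setrep -w 2 /user/demo/data.txt
```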

31) In HDFS, the put command is used to:

  • A. Copy files from the local file system to HDFS
  • B. Copy files from HDFS to the local file system
  • C. Copy files or directories from HDFS to the local file system
  • D. Copy files or directories from the local file system to HDFS
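As a quick illustration of the put command's direction (local file system into HDFS), with a hypothetical file name (a sketch, assuming a running HDFS cluster):

```
# Copy a local file (or directory) into HDFS.
hdfs dfs -put ./report.csv /user/demo/

# The reverse direction, HDFS to local, uses get.
hdfs dfs -get /user/demo/report.csv ./copy-of-report.csv
```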

32) Which of the following is not a valid Hadoop config file?

  • A. Masters
  • B. core-site.xml
  • C. hadoop-site.xml
  • D. mapred-site.xmk

33) Hadoop 2.0 allows live stream processing of real-time data.

  • A. True
  • B. False

34) The fetchdt command is not used to check for various inconsistencies.

  • A. True
  • B. False
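To make the distinction concrete: fetchdt fetches delegation tokens, while checking the filesystem for inconsistencies is the job of fsck (a sketch, assuming a running, security-enabled cluster; the token path is illustrative):

```
# fetchdt obtains a delegation token from the NameNode and saves it
# to a local file; it does not inspect the filesystem for problems.
hdfs fetchdt /tmp/my.token

# Checking HDFS for inconsistencies such as missing or corrupt
# blocks is done with fsck.
hdfs fsck / -files -blocks
```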

35) Hadoop needs specialised hardware to process the data.

  • A. True
  • B. False
