Sqoop Online Quiz


The following quiz provides Multiple Choice Questions (MCQs) related to Sqoop. Read all the given answers and click on the correct one. If you are not sure about an answer, you can check it using the Show Answer button. You can use the Next Quiz button to load a new set of questions.

Questions and Answers

Q 1 - What are the two binary file formats supported by Sqoop?

A - Avro & SequenceFile

B - Rcfile and SequenceFile

C - ORC file and RC file

D - Avro and RC file

Answer : A


These are the two binary file formats supported by Sqoop.
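As a sketch, the two formats are selected with the --as-avrodatafile and --as-sequencefile import flags (the connection string and table name below are hypothetical):

```shell
# Import the same table in each of the two binary formats Sqoop supports
# (connection details and table name are placeholders).
sqoop import --connect jdbc:mysql://db.example.com/sales \
  --table cities --as-avrodatafile

sqoop import --connect jdbc:mysql://db.example.com/sales \
  --table cities --as-sequencefile
```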

Q 2 - For some databases, Sqoop can do faster data transfer by using the parameter

A - --bulkload

B - --fastload

C - --dump

D - --direct

Answer : D


Direct mode delegates the data transfer to the native utilities provided by the database.
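A minimal sketch of a direct-mode import, assuming a MySQL source (connection string, username, and table name are hypothetical):

```shell
# Import using the database's native bulk tooling rather than plain JDBC;
# for MySQL this delegates to mysqldump (all names are placeholders).
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username sqoop_user \
  --table orders \
  --direct
```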

Answer : C


With a free-form query, we can write a SQL query involving a join between two tables and pass it with the --query parameter while importing. It is used in place of the --table parameter.
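A sketch of such an import; note that a free-form query must contain the $CONDITIONS token so Sqoop can split the work across mappers, and --target-dir is required (table and column names here are hypothetical):

```shell
# Free-form query joining two tables; $CONDITIONS is mandatory so that
# Sqoop can partition the query across parallel mappers.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --query 'SELECT o.id, o.amount, c.name FROM orders o JOIN customers c ON o.customer_id = c.id WHERE $CONDITIONS' \
  --split-by o.id \
  --target-dir /user/sqoop/orders_customers
```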

Q 4 - In an import involving a join of two tables, if the two tables have columns with matching names, the conflict can be resolved by

A - Using table aliases

B - Column aliases

C - First creating temporary tables from each table with different column names

D - Renaming the columns in the source system and then importing

Answer : B


We can create column aliases in the import query, and the MapReduce job will refer to the aliases, avoiding the conflict.
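For illustration, the conflicting columns can be aliased directly in the free-form query (schema and names below are hypothetical):

```shell
# Both tables have an "id" column; aliasing them as order_id and
# customer_id gives the generated code unique column names.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --query 'SELECT o.id AS order_id, c.id AS customer_id, c.name FROM orders o JOIN customers c ON o.customer_id = c.id WHERE $CONDITIONS' \
  --split-by order_id \
  --target-dir /user/sqoop/orders_customers
```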

Q 5 - When using the --staging-table parameter while loading data into relational tables, the creation of the staging table is done

A - Automatically by Sqoop

B - Automatically by database

C - User has to ensure it is created

D - Automatically created by a Hadoop process outside of Sqoop

Answer : C


The user has to ensure that the staging table is created and accessible by Sqoop.
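A sketch of an export through a staging table; the staging table (here cities_stage, a hypothetical name) must already exist with the same structure as the target table:

```shell
# Export via a pre-created staging table; --clear-staging-table empties
# it before the job runs (table names and paths are placeholders).
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --table cities \
  --staging-table cities_stage \
  --clear-staging-table \
  --export-dir /user/sqoop/cities
```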

Answer : A


Only the columns other than those named in the --update-key parameter will appear in the SET clause.
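As a sketch, in an update-mode export the key column is used only in the WHERE clause of the generated UPDATE statements (table and column names are hypothetical):

```shell
# "id" is the update key: generated statements look like
# UPDATE cities SET country = ?, city = ? WHERE id = ?
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --table cities \
  --update-key id \
  --export-dir /user/sqoop/cities
```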

Answer : A


If there are columns whose values are mandatory and the HDFS file does not include them in the subset, the load will fail.
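For illustration, exporting only a subset of columns with the --columns parameter (hypothetical schema); any omitted NOT NULL column without a default value would cause the load to fail:

```shell
# Export only two of the table's columns; columns left out must be
# nullable or have defaults in the target table.
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --table cities \
  --columns country,city \
  --export-dir /user/sqoop/cities
```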

Q 8 - The temporary location to which sqoop moves the data before loading into hive is specified by the parameter

A - --target-dir

B - --source-dir

C - --hive-dir

D - --sqoop-dir

Answer : A


The --target-dir parameter specifies the directory used for temporarily staging the data before loading it into the Hive table.
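A sketch of a Hive import, assuming hypothetical connection details and paths:

```shell
# Data is first written to the --target-dir location on HDFS, then
# moved into the Hive warehouse by the --hive-import step.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --table cities \
  --hive-import \
  --target-dir /tmp/sqoop-staging/cities
```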

Q 9 - The parameter used to identify the individual row in HBase while importing data to it using sqoop is

A - --hbase-row-key

B - --hbase-rowkey

C - --hbase-rowid

D - --hbase-row-id

Answer : A


The --hbase-row-key parameter is used in Sqoop to identify each row in the HBase table.
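A sketch of an HBase import; the source column named by --hbase-row-key becomes the HBase row key (table, column-family, and column names below are hypothetical):

```shell
# Import into HBase: every source row becomes an HBase row whose
# key is taken from the "id" column.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --table cities \
  --hbase-table cities \
  --column-family world \
  --hbase-row-key id
```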

Q 10 - The tool that populates a Hive metastore with a definition for a table based on a database table previously imported to HDFS is

A - create-hive-table

B - import-hive-metastore

C - create-hive-metastore

D - update-hive-metastore

Answer : A


Define in Hive a table named emps with a definition based on a database table named employees −

$ sqoop create-hive-table --connect jdbc:mysql://db.example.com/corp \
   --table employees --hive-table emps