The following quiz provides Multiple Choice Questions (MCQs) related to Sqoop. You will have to read all the given answers and click on the correct answer. If you are not sure about the answer, you can check it using the Show Answer button. You can use the Next Quiz button to check a new set of questions in the quiz.
Q 1 - To run sqoop from multiple nodes, it has to be installed in
Once installed on one node, it automatically gets replicated to the other nodes in the cluster.
Q 2 - What are the two different incremental modes of importing data using sqoop?
The --incremental parameter is used to fetch only new data (rows that do not already exist in Hadoop). In append mode, a check column is specified and only rows whose value in that column is greater than the last imported value are fetched. In lastmodified mode, a timestamp column (such as a last_updated_date column on the existing table) is used to identify new and updated rows.
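For illustration (the connection string, table, and column names below are hypothetical), an append-mode import and a lastmodified-mode import might look like −

$ sqoop import --connect jdbc:mysql://db.example.com/corp --table orders \
    --incremental append --check-column id --last-value 100

$ sqoop import --connect jdbc:mysql://db.example.com/corp --table orders \
    --incremental lastmodified --check-column last_updated_date --last-value "2018-01-01 00:00:00"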
Q 3 - The argument in a saved sqoop job can be altered at run time by using the option
For a saved job named 'job1', the --table parameter can be altered at run time by using the command below.
sqoop job --exec job1 -- --table newtable
Q 4 - Data Transfer using sqoop can be
Data can be both imported into and exported from the Hadoop system using sqoop.
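For example (the database URL and directory paths are hypothetical), an import into HDFS and an export back to the database might look like −

$ sqoop import --connect jdbc:mysql://db.example.com/corp --table employees \
    --target-dir /data/employees

$ sqoop export --connect jdbc:mysql://db.example.com/corp --table employees \
    --export-dir /data/employees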
Q 5 - Which of the following is a disadvantage of using the --staging-table parameter?
All the listed options are disadvantages of using the --staging-table option.
Q 6 - The --update-key parameter can
The --update-key parameter cannot export new rows which do not have a matching key in the already-exported table.
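For example (table and key names are hypothetical), the export below only updates rows whose id already exists in the target table −

$ sqoop export --connect jdbc:mysql://db.example.com/corp --table orders \
    --export-dir /data/orders --update-key id

Adding --update-mode allowinsert would make Sqoop insert the non-matching rows as well.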
Q 7 - If the table to which data is being exported has more columns than the data present in the HDFS file, then
The load can still be done by specifying the --columns parameter to populate a subset of columns in the relational table.
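For example (table, directory, and column names are hypothetical), the export below populates only three of the table's columns −

$ sqoop export --connect jdbc:mysql://db.example.com/corp --table employees \
    --export-dir /data/employees --columns "id,name,salary"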
Q 8 - To ensure that the columns created in Hive by sqoop have the correct data types, the parameter used by sqoop is
The correct column mapping is handled by the --map-column-hive parameter.
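For example (connection and column names are hypothetical), the import below maps the id and salary columns to specific Hive types −

$ sqoop import --connect jdbc:mysql://db.example.com/corp --table employees \
    --hive-import --map-column-hive id=INT,salary=DOUBLE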
Q 9 - The parameter that can create a hbase table using sqoop when importing data to hbase is
If the --hbase-create-table parameter is mentioned during the import, then the Hbase table gets created by sqoop if it does not already exist.
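For example (table and column family names are hypothetical; the Sqoop user guide spells the parameter --hbase-create-table), the import below creates the HBase table if it is missing −

$ sqoop import --connect jdbc:mysql://db.example.com/corp --table employees \
    --hbase-table employees --column-family info --hbase-create-table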
Q 10 - The tool that populates a Hive metastore with a definition for a table based on a database table previously imported to HDFS is
Define in Hive a table named emps with a definition based on a database table named employees −
$ sqoop create-hive-table --connect jdbc:mysql://db.example.com/corp \
    --table employees --hive-table emps