This section presents various mock tests related to Sqoop. You can download these sample mock tests to your local machine and solve them offline at your convenience. Every mock test is supplied with an answer key so you can verify your final score and grade yourself.
Q 1 - What is achieved by using the --meta-connect parameter in a sqoop command?
With the --meta-connect parameter, the metastore runs as a service on the default port 16000 and becomes accessible throughout the cluster.
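A minimal sketch of how a shared metastore might be used; the host name, job name, and connection details below are placeholders:

```shell
# Start the shared metastore service (listens on port 16000 by default)
sqoop metastore

# From any node in the cluster, create a saved job against that metastore
# (metastore-host.example.com and the MySQL connection string are hypothetical)
sqoop job \
  --meta-connect jdbc:hsqldb:hsql://metastore-host.example.com:16000/sqoop \
  --create nightly-import \
  -- import --connect jdbc:mysql://db.example.com/shop --table orders
```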
Q 2 - The free-form query import feature in sqoop allows to import data from
With a free-form query, we can write a SQL query involving a join between two tables and pass it with the --query parameter while importing. It is used in place of the --table parameter.
Q 3 - The clause 'WHERE $CONDITIONS' in the sql query specified to import data, serves the purpose of
The WHERE $CONDITIONS clause is a placeholder that Sqoop replaces with range conditions, splitting the result of the SQL query into multiple chunks that are imported in parallel.
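A sketch of a free-form query import combining --query, the mandatory $CONDITIONS placeholder, and --split-by; the connection string, credentials, and table names are hypothetical. Note the single quotes, which keep the shell from expanding $CONDITIONS:

```shell
# Free-form query import: $CONDITIONS is substituted by Sqoop to split
# the result set across mappers on the --split-by column
sqoop import \
  --connect jdbc:mysql://db.example.com/shop \
  --username dbuser -P \
  --query 'SELECT o.id, o.total, c.name FROM orders o JOIN customers c ON o.customer_id = c.id WHERE $CONDITIONS' \
  --split-by o.id \
  --target-dir /data/orders_with_customers
```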
Q 4 - The parameter to give a custom name to the mapreduce job running a sqoop import command is −
The --mapreduce-job-name parameter gives a user-chosen name to the job launched by the sqoop command so that it can be easily distinguished from other jobs in the JobTracker UI.
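A minimal sketch; the connection string, table, and job name are placeholders:

```shell
# Give the underlying MapReduce job a recognizable name
sqoop import \
  --connect jdbc:mysql://db.example.com/shop \
  --table orders \
  --mapreduce-job-name orders-daily-import
```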
Q 5 - While using a free-form query to import data, Sqoop finds that two columns from the joined tables have the same name. In this case the job
The job will fail as the mapreduce job creates java classes for each of the column names and two java classes cannot have the same name in the same mapreduce job.
Q 6 - The --boundary-query parameter is used to
By default, Sqoop finds the minimum and maximum value of the column specified in the --split-by parameter so that it can partition the data into multiple independent slices that are transferred in parallel. The --boundary-query parameter lets you supply a custom query that returns these boundary values instead.
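A sketch of overriding boundary discovery with a custom query; the connection string, table, and column names are hypothetical:

```shell
# Supply the min/max boundaries for the split column ourselves instead of
# letting Sqoop run its default SELECT MIN(...), MAX(...) against the table
sqoop import \
  --connect jdbc:mysql://db.example.com/shop \
  --table orders \
  --split-by id \
  --boundary-query "SELECT MIN(id), MAX(id) FROM orders WHERE id > 0"
```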
Q 7 - In a table import the name of the mapreduce job
The name of the job is based on the name of the table which is being imported.
Q 8 - In an import involving a join of two tables, if there are two columns with the same name in the two tables, the conflict can be resolved by
We can create column aliases in the import query and the mapreduce job will refer to the column aliases, avoiding the conflict.
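A sketch of such an aliased join; the connection string, tables, and columns are placeholders:

```shell
# Both orders and customers have an "id" column; aliasing gives the
# generated Java class unique field names (order_id, customer_id)
sqoop import \
  --connect jdbc:mysql://db.example.com/shop \
  --query 'SELECT o.id AS order_id, c.id AS customer_id, o.total FROM orders o JOIN customers c ON o.customer_id = c.id WHERE $CONDITIONS' \
  --split-by order_id \
  --target-dir /data/orders_joined
```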
Q 9 - Data Transfer using sqoop can be
Data can be both imported to and exported from the Hadoop system using Sqoop.
Q 10 - While importing data into Hadoop using sqoop, the SQL SELECT clause is used. Similarly, while exporting data from Hadoop, the SQL clause used is
The INSERT statements are generated by sqoop to insert data into the relational tables.
Q 11 - While inserting data into Relational system from Hadoop using sqoop, the various table constraints present in the relational table must be
We must verify that the data being exported does not violate the constraints of the relational table.
Q 12 - The export and import of data between sqoop and relational system happens through which of the following programs?
The sqoop client only submits the command and oversees the completion or failure of the command. The Mapreduce job created will do the actual data transfer.
Q 13 - When does sqoop gather the metadata of the relational table into which it exports the data?
Every time a sqoop command is submitted, it verifies the metadata of the table before starting the export.
Q 14 - Sqoop’s default behavior while inserting rows into relational tables is
The default behavior is to insert one row at a time, though Sqoop can be configured for bulk load.
Q 15 - Which parameter in sqoop is used for bulk data export to relational tables?
The --batch parameter uses the JDBC batch load capability to do bulk load.
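A minimal sketch of a batched export; the connection string, table, and HDFS path are placeholders:

```shell
# Use JDBC batching instead of one statement per row
sqoop export \
  --connect jdbc:mysql://db.example.com/shop \
  --table orders \
  --export-dir /data/orders \
  --batch
```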
Q 16 - What does the parameter -Dsqoop.export.records.per.statement=10 do in a sqoop export command?
With this parameter, the sqoop command places the values from 10 records in each INSERT statement.
Q 17 - The parameter which decides how many rows will be inserted per transaction in sqoop is
-Dsqoop.export.statements.per.transaction decides how many INSERT statements are issued per transaction, and therefore how many rows are inserted per transaction.
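The two properties can be combined, as in this sketch (the connection string, table, and path are hypothetical):

```shell
# 10 rows per INSERT statement x 10 statements per transaction
# => 100 rows committed per transaction
sqoop export \
  -Dsqoop.export.records.per.statement=10 \
  -Dsqoop.export.statements.per.transaction=10 \
  --connect jdbc:mysql://db.example.com/shop \
  --table orders \
  --export-dir /data/orders
```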
Q 18 - The insert query used to insert exported data into tables is generated by
The insert query is generated only by the sqoop command and is processed as such, without any further modification by any other driver.
Q 19 - When the “sqoop.export.records.per.statement” is set to two or more, the query created by sqoop has the SQL form of
Many databases support the multi-row form shown in option (D), e.g. `INSERT INTO table VALUES (...), (...), ...`, to process multiple rows in a single insert statement.
Q 20 - What happens if the sqoop generated export query is not accepted by the database?
The export fails if the query is not accepted by the database.
Q 21 - Using a higher value for the parameter sqoop.export.statements.per.transaction will
In the scenario where the database requires a table-level write lock, a higher value of sqoop.export.statements.per.transaction will lock the table for a longer time and decrease performance.
Q 22 - The --staging-table parameter is used for
When you want to verify that all the required data has indeed been successfully exported before loading it into the final table, use the --staging-table parameter.
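A sketch of an export through a staging table; the connection string and table names are placeholders, and the staging table must already exist with the same schema as the target:

```shell
# Export into orders_stage first; data reaches the final orders table only
# after the staging load succeeds. --clear-staging-table empties any
# leftover rows from a previous failed run.
sqoop export \
  --connect jdbc:mysql://db.example.com/shop \
  --table orders \
  --staging-table orders_stage \
  --clear-staging-table \
  --export-dir /data/orders
```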
Q 23 - With the --staging-table parameter, the data is moved from staging to final table
Sqoop loads the final table from the staging table after the staging load completes successfully.
Q 24 - Which of the following is a disadvantage of using the --staging-table parameter?
All the listed options are disadvantages of using the --staging-table option.
Q 25 - While loading data into relational tables using the --staging-table parameter, the creation of the staging table is done
The user has to ensure that the staging table is created and accessible by Sqoop.