
SAP BODS Online Quiz
The following quiz provides Multiple Choice Questions (MCQs) related to SAP BODS. You will have to read all the given answers and click on the correct one. If you are not sure about an answer, you can check it using the Show Answer button. You can use the Next Quiz button to load a new set of questions.

Q 1 - Which of the following components is responsible for the extraction, transformation, and loading of data in Data Services Designer?
Answer : B
Explanation
Data flows extract, transform, and load data. Everything having to do with data, including reading sources, transforming data, and loading targets, occurs inside a data flow.
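As a rough analogy only (plain Python, not Data Services syntax), the sketch below shows the same extract-transform-load pattern happening inside one flow; the table and column names are hypothetical.

```python
# Illustrative analogy of what a data flow does: read a source,
# transform rows, and load a target. Not BODS code; names are made up.

def extract(source_rows):
    # "Source" step: read rows from some input (here, an in-memory list).
    return list(source_rows)

def transform(rows):
    # "Transform" step: e.g., uppercase a column and drop empty rows.
    return [
        {**row, "CITY": row["CITY"].upper()}
        for row in rows
        if row.get("CITY")
    ]

def load(rows, target):
    # "Target" step: append the transformed rows to the target table.
    target.extend(rows)

# Everything to do with the data happens "inside the data flow",
# i.e. inside this call chain.
source = [{"ID": 1, "CITY": "berlin"}, {"ID": 2, "CITY": ""}]
target = []
load(transform(extract(source)), target)
print(target)   # [{'ID': 1, 'CITY': 'BERLIN'}]
```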
Q 2 - Which of the following is not a repository type in the Data Services architecture?
Answer : C
Explanation
There are three types of repositories in Data Services −
- Local
- Central
- Profiler
Q 3 - When a variable is used multiple times within a job, is it always suggested to use a local variable?
Answer : B
Explanation
When a variable needs to be used multiple times within a job, a global variable is used.
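As a scoping analogy only (Python, not the Data Services scripting language), the hypothetical sketch below shows why a value needed by several steps of a job is defined once at job level, like a global variable, rather than redeclared inside each step.

```python
# Analogy only: a job-level ("global") variable set once and reused
# by several workflow steps. All names are hypothetical.

LOAD_DATE = "2024-01-31"     # job-level value, defined a single time

def workflow_sales():
    # uses the job-level value instead of its own local copy
    return f"sales loaded for {LOAD_DATE}"

def workflow_finance():
    return f"finance loaded for {LOAD_DATE}"

print(workflow_sales())
print(workflow_finance())
```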
Q 4 - Which of the following is used to check the Access Server status and real-time services at the web application layer?
Answer : C
Q 5 - Which of the following methods can be used in a data flow to extract or load data directly with an XML file?
Answer : A
Q 6 - Which of the following transforms can be used for column mapping from input to output schemas, assigning primary keys, etc.?
Answer : C
Explanation
The Query transform is the most common transform used in Data Services, and you can use it to perform the following functions −
- Filtering data from sources
- Joining data from multiple sources
- Performing functions and transformations on data
- Mapping columns from input to output schemas
- Assigning primary keys
- Adding new columns, schemas, and function results to the output schemas
As the Query transform is the most commonly used transform, a shortcut for it is provided in the tool palette.
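As a rough, non-BODS illustration of the operations listed above, the Python sketch below joins two inputs, filters rows, maps and renames columns, and adds a derived column; all table and column names are hypothetical.

```python
# Analogy only: what a Query transform does, expressed as plain Python.

customers = [{"CUST_ID": 1, "NAME": "Alice"},
             {"CUST_ID": 2, "NAME": "Bob"}]
orders    = [{"ORDER_ID": 10, "CUST_ID": 1, "AMOUNT": 250.0},
             {"ORDER_ID": 11, "CUST_ID": 2, "AMOUNT":  80.0}]

output = []
for o in orders:
    for c in customers:
        if o["CUST_ID"] == c["CUST_ID"] and o["AMOUNT"] > 100:   # join + filter
            output.append({
                "ORDER_KEY": o["ORDER_ID"],          # column mapping / primary key
                "CUSTOMER":  c["NAME"].upper(),      # function applied to a column
                "AMOUNT_EUR": round(o["AMOUNT"], 2), # new derived column
            })

print(output)  # [{'ORDER_KEY': 10, 'CUSTOMER': 'ALICE', 'AMOUNT_EUR': 250.0}]
```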
Q 7 - A data flow that is called from another data flow in the design is known as?
Answer : C
Explanation
An embedded data flow is a data flow that is called from another data flow in the design. An embedded data flow can contain multiple sources and targets, but only one input or output passes data to the main data flow.
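As a loose analogy (not actual Data Services objects), an embedded data flow behaves like a reusable function that exposes a single input or output port to its caller; the sketch below uses hypothetical names.

```python
# Analogy only: an "embedded data flow" packaged as a function.
# It may read several internal sources, but exposes a single output
# port back to the calling ("main") data flow.

def embedded_dataflow():
    source_a = [{"ID": 1, "QTY": 5}]
    source_b = [{"ID": 2, "QTY": 7}]
    combined = source_a + source_b                  # multiple internal sources
    return [r for r in combined if r["QTY"] > 0]    # one output port

def main_dataflow(target):
    rows = embedded_dataflow()      # called from the main data flow
    target.extend(rows)             # loaded into the main flow's target

warehouse = []
main_dataflow(warehouse)
print(len(warehouse))   # 2
```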
Q 8 - Which of the following recovery mechanisms allows you to rerun a job without considering the previous partial run?
Answer : B
Explanation
- Automatic Recovery − This allows you to run unsuccessful jobs in recovery mode.
- Manual Recovery − This allows you to rerun a job without considering the partial run from the previous time.
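To make the contrast concrete, here is a minimal sketch (plain Python, not BODS behaviour verbatim): recovery mode resumes an unsuccessful job at the step that failed, while a plain manual rerun executes every step again; the step names and status flags are hypothetical.

```python
# Analogy only: recovery-mode rerun vs. plain manual rerun.

completed_steps = {"extract": True, "transform": True, "load": False}
steps = ["extract", "transform", "load"]

def run(step):
    print(f"running {step}")
    completed_steps[step] = True

def rerun_in_recovery_mode():
    # skip steps that already completed in the failed run
    for step in steps:
        if not completed_steps[step]:
            run(step)

def manual_rerun():
    # no knowledge of the previous partial run: everything runs again
    for step in steps:
        run(step)

rerun_in_recovery_mode()   # only "load" runs again
```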
Q 9 - As per the recommended naming conventions, what should be prefixed to a datastore name?
Answer : A
Q 10 - Why do we use checkpoints in an ETL process?
A - To provide markers for data that has been processed in case an error occurs.
B - To create multiple streams from a single stream in a data flow.
Answer : A
Explanation
Checkpoints are used to provide markers for data that has been processed, in case an error occurs during the ETL process.
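As a minimal sketch of the idea (not BODS internals), the hypothetical Python example below saves a checkpoint marker after each processed row, so a rerun after a failure skips rows that already succeeded; the file name and row data are made up.

```python
# Analogy only: a checkpoint marker lets a job resume after a failure
# without reprocessing rows that were already handled.

import json
import os

CHECKPOINT_FILE = "checkpoint.json"          # hypothetical marker file
rows = [{"ID": i} for i in range(1, 11)]     # hypothetical source rows

def last_checkpoint():
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["last_id"]
    return 0

def save_checkpoint(row_id):
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"last_id": row_id}, f)

def run_job():
    start = last_checkpoint()                # resume after the last marker
    for row in rows:
        if row["ID"] <= start:
            continue                         # already processed in a prior run
        # ... load the row into the target here ...
        save_checkpoint(row["ID"])           # marker: this row is done

run_job()   # rerunning after an error skips rows up to the saved marker
```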