
SAP BODS Online Quiz
The following quiz provides Multiple Choice Questions (MCQs) related to SAP BODS. You will have to read all the given answers and click on the correct answer. If you are not sure about the answer, you can check it using the Show Answer button. You can use the Next Quiz button to load a new set of questions.

Q 1 - When we create an ETL job in Data Services Designer, what is the correct hierarchy for the objects below?
A - Project, Data Flow, Work flow, Replication job
B - Project, Work flow, Data Flow, Replication job
Answer : D
Q 2 - Which of the following is responsible for holding user-defined objects, metadata and transformation rules in Data Services?
Answer : B
Explanation
The Data Services repository stores all user-defined objects, metadata and transformation rules.
Q 3 - Which of the following Data Services tools is used to manage user access and security features?
Answer : A
Explanation
BODS depends on the Central Management Console (CMC) for user access and security features. This applies to version 4.x; in earlier versions, it was handled in the Management Console.
Q 4 - To use an ECC system as a data source, which datastore type should be used?
Answer : A
Q 5 - SAP Business Objects Data Services provides an option to connect to Mainframe interfaces using which of the following?
Answer : D
Explanation
SAP Business Objects Data Services provides an option to connect to Mainframe interfaces using the Attunity Connector. Using Attunity, a Datastore can connect to the sources given below −
- DB2 UDB for OS/390
- DB2 UDB for OS/400
- IMS/DB
- VSAM
- Adabas
- Flat Files on OS/390 and OS/400
Q 6 - Which of the following is not a transform type under Data Integration?
Answer : E
Explanation
Data Integration transforms are used to extract, transform and load data to a DW system. They ensure data integrity and improve developer productivity.
- Data_Generator
- Data_Transfer
- Effective_Date
- Hierarchy_flattening
- Table_Comparison, etc.
Data Cleanse is a Data Quality transform, not a Data Integration transform.
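Table_Comparison is configured graphically in the Designer rather than in code, but its effect can be illustrated with a short, hypothetical Python sketch (the function name and the "I"/"U" opcodes below are illustrative, not BODS syntax): incoming rows are compared against the target table on a key, and each row is tagged with the operation needed to bring the target in sync.

```python
# Hypothetical sketch of a Table_Comparison-style transform.
# Rows missing from the target are tagged "I" (insert); rows whose
# key exists but whose values differ are tagged "U" (update).

def table_comparison(incoming, target, key):
    """Return (row, opcode) pairs describing how to sync the target."""
    existing = {row[key]: row for row in target}  # index target by key
    result = []
    for row in incoming:
        if row[key] not in existing:
            result.append((row, "I"))        # new key: insert
        elif row != existing[row[key]]:
            result.append((row, "U"))        # changed values: update
    return result                            # unchanged rows are dropped
```

For example, an incoming row with an existing key but a changed column would come out tagged "U", while a row with an unseen key would come out tagged "I".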
Q 7 - Which of the following is a type of embedded data flow?
Answer : D
Explanation
The following types of embedded data flows can be used −
- One Input − the embedded data flow is added at the end of a data flow.
- One Output − the embedded data flow is added at the beginning of a data flow.
- No Input or Output − replicates an existing data flow.
Q 8 - Which of the following parameters allows you to maintain the history of all changes made to an object, so that you can check all previous versions and revert to an older version?
Answer : C
Q 9 - As per the recommended naming conventions, what prefix should be used for a datastore?
Answer : A
Q 10 - Why do we use checkpoints in an ETL process?
A - To provide markers for data that has been processed in case an error occurs.
B - To create multiple streams from a single stream in a data flow.
Answer : A
Explanation
Checkpoints provide markers for data that has already been processed, so that if an error occurs during the ETL process the job can recover and resume from the last marker instead of reprocessing everything.
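In BODS this recovery behavior is enabled through job properties in the Designer, but the underlying idea can be sketched in plain Python (the function, file format and batch logic below are illustrative assumptions, not BODS APIs): after each batch is loaded, a marker records the last processed position, and a restarted run resumes from that marker.

```python
import json
import os

# Hypothetical sketch of checkpoint-based recovery in an ETL run.
# A small JSON file stores the index of the last processed record;
# a rerun after a failure resumes from there instead of from zero.

def run_etl(records, checkpoint_path, batch_size=2, fail_after=None):
    """Process records in batches, persisting a checkpoint after each batch."""
    start = 0
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            start = json.load(f)["last_processed"]  # resume point
    loaded = []
    for i in range(start, len(records), batch_size):
        if fail_after is not None and i >= fail_after:
            raise RuntimeError("simulated failure mid-run")
        loaded.extend(records[i:i + batch_size])    # "load" step
        with open(checkpoint_path, "w") as f:       # mark batch as done
            json.dump({"last_processed": i + batch_size}, f)
    return loaded
```

A first run that fails partway leaves the checkpoint file behind; rerunning the same job then processes only the records after the marker, which is the behavior the question describes.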