What is Data Reconciliation?


Data reconciliation is the verification phase of a data migration. In this phase, the target data is compared with the source data to confirm that the migration process has transferred the records completely and correctly. Data validation and reconciliation also refers to a technology that uses mathematical models to check and correct process data.
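To illustrate the basic idea outside of any particular tool, the following is a minimal Python sketch of record-level reconciliation. All names here (row_fingerprint, reconcile) and the sample rows are hypothetical illustrations, not part of any product API; a real migration would read from the actual source and target systems.

```python
import hashlib

def row_fingerprint(row):
    """Hash a record into a stable fingerprint so rows can be
    compared without holding both full datasets side by side."""
    canonical = "|".join(str(value) for value in row)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare source and target record sets.

    Returns fingerprints missing from the target and fingerprints
    that appear only in the target (unexpected records).
    """
    source_fp = {row_fingerprint(r) for r in source_rows}
    target_fp = {row_fingerprint(r) for r in target_rows}
    return source_fp - target_fp, target_fp - source_fp

# Hypothetical sample data standing in for extracted tables.
source = [(1, "Alpha", 100.0), (2, "Beta", 250.0)]
target = [(1, "Alpha", 100.0), (2, "Beta", 999.0)]  # corrupted amount

missing, unexpected = reconcile(source, target)
print(f"missing from target: {len(missing)}, unexpected: {len(unexpected)}")
```

Hashing each record, rather than storing the rows themselves, keeps the comparison memory-bounded and avoids moving full payloads between systems just to check them.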

An essential aspect of ensuring data quality in business intelligence is the consistency of that data. As a data warehouse, business intelligence integrates and transforms data and stores it so that it is available for analysis and interpretation.

Consistency must therefore be guaranteed across the individual process steps. Data reconciliation for DataSources enables us to verify the consistency of data that has been loaded into business intelligence and is available and used productively there.

The term productive DataSource refers to DataSources that are used for data transfer in the productive operation of business intelligence. The term data reconciliation DataSource refers to DataSources that serve as a reference by accessing the application data in the source directly, and thus allow comparison with the source data.

  • Data Model − The data model is based on 3.x objects (data flow with transfer rules). The productive DataSource uses data transfer to supply the data that is to be checked in business intelligence. A transformation connects the DataSource fields with the InfoObjects of a DataStore object that has been created for data reconciliation, using direct assignment.

    The data reconciliation DataSource gives a VirtualProvider direct access to the application data. In a MultiProvider, the data from the DataStore object is combined with the data read directly from the source. In a query defined on the MultiProvider, the loaded data can then be compared with the application data in the source system.

  • Modelling Aspects − Data reconciliation for DataSources allows us to check the integrity of the loaded data. It does this by comparing the total of a key figure in the DataStore object with the corresponding total that the VirtualProvider reads directly from the source system (a sketch of this comparison follows this list).

    Extractor error analysis can be used to identify possible errors in data processing. This function is available if the data reconciliation DataSource uses a different extraction module from the productive DataSource.

    Keep the volume of transferred data as small as possible, because the data reconciliation DataSource accesses the data in the source system directly. This is best implemented with a data reconciliation DataSource delivered with business intelligence content, or with a generic DataSource based on function modules, since these allow aggregation logic to be implemented in the source.
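As a rough illustration of the modelling aspects above, the sketch below compares aggregated key-figure totals the way a query on the MultiProvider would: totals per characteristic from the loaded DataStore data against totals read directly from the source. This is a plain-Python analogy with invented data, not SAP code; the aggregation step mirrors the recommendation to aggregate in the source so that only small result sets are transferred.

```python
from collections import defaultdict

def aggregate_totals(rows, key_index=0, value_index=1):
    """Sum a key figure per characteristic value, mimicking the
    aggregation logic a generic DataSource would run in the source."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_index]] += row[value_index]
    return dict(totals)

def compare_totals(loaded, reference, tolerance=0.01):
    """Report characteristics whose loaded total deviates from the
    total read directly from the source by more than the tolerance."""
    mismatches = {}
    for key in set(loaded) | set(reference):
        diff = loaded.get(key, 0.0) - reference.get(key, 0.0)
        if abs(diff) > tolerance:
            mismatches[key] = diff
    return mismatches

# Hypothetical (characteristic, key figure) pairs.
datastore_rows = [("plant_01", 500.0), ("plant_02", 320.0)]  # loaded into BI
source_rows = [("plant_01", 500.0), ("plant_02", 300.0)]     # read directly

mismatches = compare_totals(aggregate_totals(datastore_rows),
                            aggregate_totals(source_rows))
print(mismatches)  # {'plant_02': 20.0}
```

Comparing aggregates rather than individual records is the trade-off the text describes: it keeps the data volume read from the source small, at the cost of locating a discrepancy only down to the characteristic value, not the exact record.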
