
Found 6705 Articles for Database

307 Views
There are various tools for the management of data warehouse quality, as follows − Quality Definition − A definition and quantification of quality is given as the ratio of performance to expectation. Quality can also be defined as the loss imparted to society from the time a product is shipped; the total loss to society is then the sum of the producer's loss and the user's loss. It is well known that there is a tradeoff between the quality of a product or service and its production cost, and that an organization should find an equilibrium between these two parameters. If ... Read More
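
As a rough formal sketch of the two quantities mentioned above (the symbols are illustrative, not taken from the article):

```latex
% Quality as the ratio of delivered performance to expected performance
Q = \frac{\text{Performance}}{\text{Expectation}}

% Total loss to society: the producer's loss plus the user's loss
L_{\text{total}} = L_{\text{producer}} + L_{\text{user}}
```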

742 Views
Data propagation is the distribution of data from one or more source data warehouses to another local access database, according to propagation rules. Data warehouses have to manage large volumes of data every day. A data warehouse can start with little information and grow day by day through constant sharing and receiving of data from multiple data sources. As data sharing grows, data warehouse management becomes a major problem. Database management is required to handle corporate information more effectively, in multiple subsets, arrangements, and time frames. These data resources need to be constantly updated, and the ... Read More
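
A minimal sketch of rule-based propagation, assuming rows are plain Python dicts and a propagation rule is just a filter function (the names `propagate` and `eu_2023_rule` are illustrative):

```python
# Minimal sketch of rule-based data propagation: copy only the rows from a
# source data warehouse that match a propagation rule into a local database
# (both represented here as plain Python lists of dicts).

source_warehouse = [
    {"region": "EU", "year": 2023, "sales": 1200},
    {"region": "US", "year": 2023, "sales": 1800},
    {"region": "EU", "year": 2022, "sales": 900},
]

# A propagation rule: which subset of the data a local database should receive.
def eu_2023_rule(row):
    return row["region"] == "EU" and row["year"] == 2023

def propagate(source, rule):
    """Return the subset of `source` selected by `rule` for the local database."""
    return [row for row in source if rule(row)]

local_access_db = propagate(source_warehouse, eu_2023_rule)
print(local_access_db)   # [{'region': 'EU', 'year': 2023, 'sales': 1200}]
```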

50K+ Views
Query optimization is of great importance for the performance of a relational database, especially for the execution of complex SQL statements. A query optimizer decides the best method for executing each query. The query optimizer selects, for instance, whether or not to use indexes for a given query, and which join methods to use when joining multiple tables. These decisions have a tremendous effect on SQL performance, and query optimization is a key technology for every application, from operational systems to data warehouse and analytical systems to content management systems. The various principles of query optimization are as follows − Understand ... Read More
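
A small illustration of an optimizer's index decision, using SQLite only because it ships with Python; the exact EXPLAIN QUERY PLAN output differs across SQLite versions:

```python
# Demonstration of a query optimizer's choices using SQLite (bundled with
# Python). EXPLAIN QUERY PLAN shows whether the optimizer decides to scan
# the whole table or to use an index for the same query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)")
conn.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index on customer_id the plan is typically a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# With the index in place the optimizer usually switches to an index search.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```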

4K+ Views
Bayesian classifiers are statistical classifiers. They can predict class membership probabilities, such as the probability that a given sample belongs to a particular class. Bayesian classifiers have also exhibited high accuracy and speed when applied to large databases. Once classes are defined, the system should infer the rules that govern the classification; therefore, the system should be able to find the description of each class. The descriptions should refer only to the predicting attributes of the training set, so that only the positive examples satisfy the description and the negative examples do not. A rule is said to be correct if its ... Read More
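
A minimal naive Bayesian classifier sketch in plain Python, estimating class membership probabilities from counted frequencies with add-one smoothing; the data set and feature names are invented for illustration:

```python
# Minimal (naive) Bayesian classifier: estimate P(class | sample) from
# counted frequencies of the training data, with add-one smoothing.
from collections import Counter, defaultdict

train = [({"outlook": "sunny", "windy": "no"},  "play"),
         ({"outlook": "sunny", "windy": "yes"}, "stay"),
         ({"outlook": "rain",  "windy": "yes"}, "stay"),
         ({"outlook": "rain",  "windy": "no"},  "play")]

class_counts = Counter(label for _, label in train)
feat_counts = defaultdict(Counter)          # (class, feature) -> value counts
for features, label in train:
    for name, value in features.items():
        feat_counts[(label, name)][value] += 1

def class_probabilities(sample):
    scores = {}
    for label, count in class_counts.items():
        p = count / len(train)              # prior P(class)
        for name, value in sample.items():  # add-one smoothed P(value | class)
            counts = feat_counts[(label, name)]
            p *= (counts[value] + 1) / (sum(counts.values()) + len(counts) + 1)
        scores[label] = p
    total = sum(scores.values())
    return {label: s / total for label, s in scores.items()}  # normalise

print(class_probabilities({"outlook": "sunny", "windy": "no"}))
```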

2K+ Views
Data aggregation is a process in which data is gathered and represented in summary form, for purposes including statistical analysis. It is a kind of information and data mining procedure where data is searched, gathered, and presented in a report-based, summarized format to achieve specific business objectives or processes and/or to support human analysis. Data aggregation can be implemented manually or through specialized software. The objective of aggregation is to get more information about specific groups based on particular variables such as age, profession, or income. The data about such groups can then be used for website personalization to select content and advertising ... Read More
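
A minimal aggregation sketch in plain Python, summarizing invented records by profession into counts and averages:

```python
# Minimal data aggregation sketch: summarise individual records into group
# counts and averages by profession (the records here are invented).
from collections import defaultdict

records = [
    {"profession": "teacher",  "age": 34, "income": 48000},
    {"profession": "engineer", "age": 29, "income": 72000},
    {"profession": "teacher",  "age": 51, "income": 55000},
    {"profession": "engineer", "age": 41, "income": 88000},
]

groups = defaultdict(list)
for row in records:
    groups[row["profession"]].append(row)

# Present the gathered data in summary form: count, average age, average income.
summary = {
    profession: {
        "count": len(rows),
        "avg_age": sum(r["age"] for r in rows) / len(rows),
        "avg_income": sum(r["income"] for r in rows) / len(rows),
    }
    for profession, rows in groups.items()
}
print(summary)
```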

2K+ Views
Data reconciliation is a phase of record verification during data migration. In this phase, target data is compared with the source data to ensure that the migration architecture is transferring the data correctly. Data validation and reconciliation refers to a technology that uses mathematical models to process data. An essential aspect of ensuring the quality of information in business intelligence is the consistency of that information. In a data warehouse, business intelligence combines and converts data and stores it so that it is made accessible for analysis and interpretation. The consistency of the data across the several process steps has to be ... Read More
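
A minimal reconciliation sketch, assuming source and target rows can be keyed by a primary key; it reports rows missing from the target and rows whose values differ:

```python
# Minimal reconciliation sketch: after a migration, compare the target rows
# with the source rows by primary key and report missing or mismatched records.
source = {1: ("alice", 100.0), 2: ("bob", 250.5), 3: ("carol", 75.0)}
target = {1: ("alice", 100.0), 2: ("bob", 999.9)}            # 3 is missing, 2 differs

def reconcile(source_rows, target_rows):
    missing    = [k for k in source_rows if k not in target_rows]
    mismatched = [k for k in source_rows
                  if k in target_rows and source_rows[k] != target_rows[k]]
    return {"source_count": len(source_rows),
            "target_count": len(target_rows),
            "missing_in_target": missing,
            "mismatched": mismatched}

print(reconcile(source, target))
# {'source_count': 3, 'target_count': 2, 'missing_in_target': [3], 'mismatched': [2]}
```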

2K+ Views
The extraction method depends heavily on the source system and also on the business requirements in the target data warehouse environment. The estimated volume of data to be extracted and the stage of the ETL process (initial load or maintenance of existing records) can also influence the decision of how to extract, from both a logical and a physical point of view. There are two types of extraction methods: Logical Extraction Methods and Physical Extraction Methods. Logical Extraction Methods − There are two kinds of logical extraction, as follows − Full Extraction − The data is extracted entirely from the source system. Because this ... Read More
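
A sketch of the two logical extraction styles, assuming each source row carries an `updated_at` timestamp (the column name is invented for illustration):

```python
# Sketch of the two logical extraction styles: a full extraction pulls every
# row, an incremental one pulls only rows changed since the last run.
from datetime import datetime

source_rows = [
    {"id": 1, "amount": 10.0, "updated_at": datetime(2024, 1, 5)},
    {"id": 2, "amount": 20.0, "updated_at": datetime(2024, 2, 1)},
    {"id": 3, "amount": 30.0, "updated_at": datetime(2024, 3, 12)},
]

def full_extraction(rows):
    """Full extraction: take everything from the source system."""
    return list(rows)

def incremental_extraction(rows, last_run):
    """Incremental extraction: only rows modified after the previous run."""
    return [r for r in rows if r["updated_at"] > last_run]

print(len(full_extraction(source_rows)))                          # 3
print(incremental_extraction(source_rows, datetime(2024, 2, 15))) # only id 3
```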

5K+ Views
Extraction is the operation of extracting data from a source system for further use in a data warehouse environment. It is the first step of the ETL process. After the extraction, this data can be transformed and loaded into the data warehouse. The source systems for a data warehouse are usually transaction processing applications. For example, the source system for a sales analysis data warehouse can be an order entry system that records all of the current order activities. Data extraction is where data is analyzed and crawled through to fetch relevant information from data sources (such as a database) in a ... Read More
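
A minimal extraction step, using an in-memory SQLite table as a stand-in for an order entry system; the table and column names are made up for illustration:

```python
# Minimal extraction step of an ETL flow: pull current order activity out of
# a transactional source (SQLite here, purely for illustration) into plain
# Python dicts, ready to be transformed and loaded later.
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE order_entry (order_id INT, product TEXT, quantity INT)")
src.executemany("INSERT INTO order_entry VALUES (?, ?, ?)",
                [(1, "keyboard", 2), (2, "monitor", 1)])

def extract_orders(conn):
    """Extract: read the raw rows from the source system, no transformation yet."""
    cursor = conn.execute("SELECT order_id, product, quantity FROM order_entry")
    columns = [c[0] for c in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]

extracted = extract_orders(src)
print(extracted)   # [{'order_id': 1, 'product': 'keyboard', 'quantity': 2}, ...]
```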

18K+ Views
An artificial neural network is a system based on the operation of biological neural networks; it is a simulation of a biological neural system. A characteristic of artificial neural networks is that there are multiple architectures, which consequently require several kinds of algorithms, but despite being a complex system, a neural network is relatively simple. These networks are among the newer signal-processing technologies in the engineer's toolbox. The field is highly interdisciplinary, but this discussion restricts the view to the engineering perspective. In engineering, neural networks serve two important functions: as pattern classifiers and as non-linear adaptive filters. An Artificial Neural ... Read More
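
A toy single-neuron pattern classifier in plain Python, trained with the perceptron rule on the AND pattern; it is only a sketch of the idea, not a full network:

```python
# Tiny single-neuron network used as a pattern classifier (learning the AND
# pattern with the perceptron rule). Purely illustrative.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights, bias, learning_rate = [0.0, 0.0], 0.0, 0.1

def predict(x):
    activation = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if activation > 0 else 0          # step activation function

for _ in range(20):                             # a few passes over the data
    for x, target in samples:
        error = target - predict(x)
        weights = [w + learning_rate * error * xi for w, xi in zip(weights, x)]
        bias += learning_rate * error

print([(x, predict(x)) for x, _ in samples])    # classifies the AND pattern
```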

16K+ Views
Pruning is a procedure that reduces the size of decision trees. It can reduce the risk of overfitting by limiting the size of the tree or removing sections of the tree that provide little predictive power. Pruning helps by trimming the branches that reflect anomalies in the training data caused by noise or outliers, and reshapes the original tree in a way that improves the generalization ability of the tree. Most methods use statistical measures to remove the least reliable branches, frequently resulting in faster classification and an improvement in the ability of the tree to correctly classify independent test data. There ... Read More
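
A sketch of post-pruning via cost-complexity pruning, assuming scikit-learn (0.22 or later) is installed; the `ccp_alpha` value is arbitrary and only meant to show that pruning shrinks the tree:

```python
# Sketch of post-pruning with cost-complexity pruning (scikit-learn):
# larger ccp_alpha values trim more of the tree.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
pruned   = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(X_train, y_train)

for name, tree in [("unpruned", unpruned), ("pruned", pruned)]:
    print(name,
          "nodes:", tree.tree_.node_count,
          "test accuracy:", round(tree.score(X_test, y_test), 3))
```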