The history of data models spans three generations of DBMS −
The timeline of database history is shown below −
File-based systems emerged in the 1960s and were widely used. They store information and organize it on storage devices such as hard disks, CD-ROMs, USB drives, SSDs, and floppy disks.
The relational model was introduced by E.F. Codd in 1969. The model states that data is represented as tuples. A relational model groups data into one or more tables, and these tables are related to each other through common attributes (columns).
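The idea of tables related through a common attribute can be sketched with SQLite via Python's built-in `sqlite3` module. The `authors` and `books` tables and their columns below are purely illustrative:

```python
import sqlite3

# In-memory database; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE books (title TEXT, author_id INTEGER)")
cur.execute("INSERT INTO authors VALUES (1, 'E. F. Codd')")
cur.execute("INSERT INTO books VALUES ('A Relational Model of Data', 1)")

# The shared author_id column is what relates the two tables.
cur.execute("""
    SELECT authors.name, books.title
    FROM books JOIN authors ON books.author_id = authors.id
""")
print(cur.fetchall())  # each result row comes back as a tuple
conn.close()
```

Note how the join reconstructs related data from two tables using only the common column, with no pointers or fixed record layout as in older file-based systems.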
Databases like dBase went on sale in the 1980s. dBase was one of the first database management systems for microcomputers; it was developed by Cecil Wayne Ratliff.
In the 1990s, centralized DBMS servers came into use. This period also witnessed the introduction of MS Access.
In addition, users began working on the Internet, and data warehousing was introduced.
NoSQL and Big Data came in 2008.
Big Data describes large volumes of both structured and unstructured data. This data is so large that traditional databases cannot process it.
Hadoop and MongoDB were launched in 2009.
Hadoop uses a distributed file system (HDFS) for storing big data and MapReduce to process it. Hadoop excels at storing and processing huge volumes of data in various formats: structured, semi-structured, and unstructured.
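MapReduce is a programming model rather than a single API. A toy, single-process sketch of its map, shuffle, and reduce phases (not Hadoop's actual Java API) applied to word counting:

```python
from collections import defaultdict

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in a line of input.
    return [(word, 1) for word in line.split()]

def reduce_phase(word, counts):
    # Reduce: combine all counts emitted for a single word.
    return word, sum(counts)

lines = ["big data big tables", "big data"]

# Shuffle: group intermediate pairs by key before reducing.
grouped = defaultdict(list)
for line in lines:
    for word, count in map_phase(line):
        grouped[word].append(count)

result = dict(reduce_phase(w, c) for w, c in grouped.items())
print(result)  # {'big': 3, 'data': 2, 'tables': 1}
```

In Hadoop itself, the map and reduce functions run in parallel across a cluster, and the shuffle moves intermediate data between nodes; the structure of the computation is the same.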
MongoDB is a cross-platform, document-oriented database that provides high performance, high availability, and easy scalability. It works on the concept of collections and documents.
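A document resembles a JSON object, and a collection is a schemaless group of documents. A rough analogy in plain Python (not the pymongo API; the `users` data and `find` helper are hypothetical) looks like:

```python
# A "collection" is a group of documents; documents need not share a schema.
users = [
    {"_id": 1, "name": "Alice", "age": 30},
    {"_id": 2, "name": "Bob", "languages": ["en", "fr"]},  # different fields
]

def find(collection, **criteria):
    # Crude analogue of a MongoDB find() query by field equality.
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in criteria.items())]

print(find(users, name="Alice"))  # [{'_id': 1, 'name': 'Alice', 'age': 30}]
```

The key contrast with the relational model is that each document carries its own structure, so fields can vary from document to document within one collection.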
HBase was introduced in 2010; it is a database built on top of HDFS. HBase provides fast lookups for large tables.