
Design and Analysis - Searching Techniques
In the previous section, we discussed various sorting techniques and the cases in which they can be used. The main purpose of sorting is to arrange data in an orderly way, making it easier to search for any element within it.
Searching is the process of finding a particular record, which can be a single element or a small chunk, within a large amount of data. The data can be stored in various forms, such as arrays, linked lists, trees, heaps, and graphs. With the ever-increasing amount of data, multiple techniques exist to perform the searching operation.
Searching Techniques in Data Structures
Various searching techniques can be applied on the data structures to retrieve certain data. A search operation is said to be successful only if it returns the desired element or data; otherwise, the searching method is unsuccessful.
There are two categories these searching techniques fall into. They are −
Sequential Searching
Interval Searching
Sequential Searching
As the name suggests, the sequential searching operation traverses the elements of the data one by one to look for the desired value. The data need not be sorted for this type of search.
Example − Linear Search

Fig. 1: Linear Search Operation
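The following is a minimal sketch of a sequential search (linear search) in C. The array contents and the helper name linear_search are assumptions used only for illustration; it returns the index of the key if found, or -1 otherwise.

```c
#include <stdio.h>

/* Scan the array from left to right and return the index of key,
   or -1 if key is not present. The data does not need to be sorted. */
int linear_search(const int arr[], int n, int key) {
   for (int i = 0; i < n; i++) {
      if (arr[i] == key)
         return i;
   }
   return -1;
}

int main() {
   int arr[] = {20, 12, 10, 15, 2};
   int n = sizeof(arr) / sizeof(arr[0]);
   int pos = linear_search(arr, n, 15);
   if (pos != -1)
      printf("Element found at index %d\n", pos);
   else
      printf("Element not found\n");
   return 0;
}
```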
Interval Searching
Unlike sequential searching, the interval searching operation requires the data to be sorted. This method searches the data in intervals, either by repeatedly dividing the data into sub-parts or by jumping through the indices to locate an element.
Example − Binary Search, Jump Search etc.

Fig. 2: Binary Search Operation
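Below is a minimal sketch of an interval search, using binary search in C. The sorted array and the helper name binary_search are assumptions for illustration; the data must already be sorted, and the search range is halved on every comparison.

```c
#include <stdio.h>

/* Repeatedly halve the search interval of a sorted array.
   Returns the index of key, or -1 if key is not present. */
int binary_search(const int arr[], int n, int key) {
   int low = 0, high = n - 1;
   while (low <= high) {
      int mid = low + (high - low) / 2;   /* avoids overflow of (low + high) */
      if (arr[mid] == key)
         return mid;
      else if (arr[mid] < key)
         low = mid + 1;                    /* key lies in the upper half */
      else
         high = mid - 1;                   /* key lies in the lower half */
   }
   return -1;
}

int main() {
   int arr[] = {2, 10, 12, 15, 20, 31, 47};   /* data must be sorted */
   int n = sizeof(arr) / sizeof(arr[0]);
   int pos = binary_search(arr, n, 31);
   if (pos != -1)
      printf("Element found at index %d\n", pos);
   else
      printf("Element not found\n");
   return 0;
}
```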
Evaluating Searching Techniques
Usually, not all searching techniques are suitable for all types of data structures. In some cases, a sequential search is preferable, while in others an interval search works better. These searching techniques are evaluated by checking the running time each method takes on a particular input.
This is where asymptotic notations come into the picture. Asymptotic notations are discussed in detail in the Asymptotic Notations & Apriori Analysis chapter.
To explain briefly, there are three different cases of time complexity in which a program can run. They are −
Best Case
Average Case
Worst Case
We mostly concentrate only on the best-case and worst-case time complexities, as the average case is difficult to compute. Since the running time depends on the input given to the program, the worst-case time complexity best describes the performance of an algorithm.
For instance, the best-case time complexity of a linear search is O(1), which occurs when the desired element is found in the first comparison; the worst case is O(n), which occurs when the program traverses all the elements and still does not find the element. This is labeled an unsuccessful search. Therefore, the time complexity of linear search is stated as O(n), where n is the number of elements in the input data structure.
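As a rough illustration of the best and worst cases described above, the sketch below counts the comparisons a linear search makes when the key sits at the first position and when the key is absent. The array values and the helper name linear_search_count are assumptions used only for this demonstration.

```c
#include <stdio.h>

/* Linear search that also reports how many comparisons were made,
   so the best case (key at index 0) and worst case (key absent)
   can be observed directly. */
int linear_search_count(const int arr[], int n, int key, int *comparisons) {
   *comparisons = 0;
   for (int i = 0; i < n; i++) {
      (*comparisons)++;
      if (arr[i] == key)
         return i;
   }
   return -1;
}

int main() {
   int arr[] = {7, 3, 9, 1, 5};
   int n = sizeof(arr) / sizeof(arr[0]);
   int c;

   linear_search_count(arr, n, 7, &c);    /* best case: found immediately */
   printf("Best case comparisons: %d\n", c);    /* prints 1 */

   linear_search_count(arr, n, 42, &c);   /* worst case: unsuccessful search */
   printf("Worst case comparisons: %d\n", c);   /* prints 5 (= n) */

   return 0;
}
```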
Many types of searching methods are used to search for data entries in various data structures. Some of them include −
Linear Search
Binary Search
Interpolation Search
Jump Search
Hash Table
Exponential Search
Sublist Search
Fibonacci Search
Ubiquitous Binary Search
We will look at each of these searching methods elaborately in the following chapters.