Asymptotic Notations & Apriori Analysis

In algorithm design, complexity analysis is an essential aspect. Algorithmic complexity is mainly concerned with performance: how fast or slow an algorithm works.

The complexity of an algorithm describes its efficiency in terms of the amount of memory required to process the data and the processing time.

Complexity of an algorithm is analyzed from two perspectives: **Time** and **Space**.

Time complexity is a function describing the amount of time required to run an algorithm in terms of the size of the input. "Time" can mean the number of memory accesses performed, the number of comparisons between integers, the number of times some inner loop is executed, or some other natural unit related to the amount of real time the algorithm will take.

Space complexity is a function describing the amount of memory an algorithm takes in terms of the size of the input to the algorithm. We often speak of "extra" memory needed, not counting the memory needed to store the input itself. Again, we use natural (but fixed-length) units to measure this.

Space complexity is sometimes ignored because the space used is minimal or obvious; however, it can become as important an issue as time.
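
As a minimal sketch (Python is assumed here purely for illustration), the difference between O(1) and O(n) extra space can be seen by reversing a list in place versus building a reversed copy; only the memory allocated beyond the input is counted.

```python
# A minimal sketch of "extra" space: both functions reverse a list, but they
# allocate different amounts of memory beyond the input itself.

def reverse_in_place(arr):
    """O(1) extra space: only two index variables are allocated."""
    i, j = 0, len(arr) - 1
    while i < j:
        arr[i], arr[j] = arr[j], arr[i]
        i += 1
        j -= 1
    return arr

def reverse_with_copy(arr):
    """O(n) extra space: a second list of the same size is allocated."""
    return [arr[k] for k in range(len(arr) - 1, -1, -1)]

if __name__ == "__main__":
    print(reverse_in_place([1, 2, 3, 4]))    # [4, 3, 2, 1]
    print(reverse_with_copy([1, 2, 3, 4]))   # [4, 3, 2, 1]
```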

The execution time of an algorithm depends on the instruction set, processor speed, disk I/O speed, etc. Hence, we estimate the efficiency of an algorithm asymptotically.

The time function of an algorithm is represented by **T(n)**, where **n** is the input size.
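
As a minimal sketch (assuming Python, and assuming the counted basic operation is a comparison), **T(n)** can be made concrete by instrumenting an algorithm to count its basic operations; for a linear search the worst-case count grows linearly with **n**.

```python
# A minimal sketch: T(n) made concrete by counting a chosen basic operation.
# Here the counted operation is the comparison in a linear search (an
# illustrative assumption; any other natural unit could be counted instead).

def linear_search_count(arr, target):
    """Return (index, comparison_count) for a linear scan of arr."""
    count = 0
    for i, value in enumerate(arr):
        count += 1                  # one comparison per element examined
        if value == target:
            return i, count
    return -1, count                # worst case: T(n) = n comparisons

if __name__ == "__main__":
    for n in (10, 100, 1000):
        data = list(range(n))
        _, ops = linear_search_count(data, -1)   # absent target -> worst case
        print(f"n = {n:4d}   T(n) = {ops}")
```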

Different asymptotic notations are used to represent the complexity of an algorithm. The following notations are used to describe the running-time complexity of an algorithm.

- **O** − Big Oh
- **Ω** − Big Omega
- **θ** − Big Theta
- **o** − Little Oh
- **ω** − Little Omega

‘O’ (Big Oh) is the most commonly used notation. A function **f(n)** can be represented in the order of **g(n)**, that is $O(g(n))$, if there exist a positive constant **c** and a positive integer $n_{0}$ such that

$f(n)\leqslant c.g(n)$ for all $n > n_{0}$

Hence, function **g(n)** is an upper bound for function **f(n)**, as **g(n)** grows at least as fast as **f(n)** for large **n**.

Let us consider a given function, $f(n) = 4.n^3 + 10.n^2 + 5.n + 1$

Considering $g(n) = n^3$,

$f(n)\leqslant 5.g(n)$ for all values of $n \geqslant 11$
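
A quick numeric sweep (a finite-range sanity check, not a proof) confirms where this bound starts to hold:

```python
# A finite-range sanity check (not a proof) of the Big-Oh bound above:
# f(n) = 4n^3 + 10n^2 + 5n + 1 <= 5 * n^3 once n is large enough.

def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

c, limit = 5, 1000
threshold = None
for n in range(1, limit):
    if f(n) <= c * g(n):
        if threshold is None:
            threshold = n      # first n at which the bound holds
    else:
        threshold = None       # reset if the bound fails again later

print(f"f(n) <= {c}.g(n) for every checked n >= {threshold}")   # prints 11
```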

Hence, the complexity of **f(n)** can be represented as $O(g(n))$, i.e. $O(n^3)$.

We say that $f(n) = \Omega (g(n))$ when there exists a positive constant **c** such that $f(n)\geqslant c.g(n)$ for all sufficiently large values of **n**, where **n** is a positive integer. It means function **g** is a lower bound for function **f**; beyond a certain value of **n**, **f** never goes below **c.g**.

Let us consider a given function, $f(n) = 4.n^3 + 10.n^2 + 5.n + 1$.

Considering $g(n) = n^3$, $f(n)\geqslant 4.g(n)$ for all values of $n > 0$.
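
The inequality can be seen directly, since the lower-order terms are positive for every $n > 0$:

$$f(n) = 4.n^3 + 10.n^2 + 5.n + 1 \;\geqslant\; 4.n^3 = 4.g(n) \quad \text{for all } n > 0$$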

Hence, the complexity of **f(n)** can be represented as $\Omega (g(n))$, i.e. $\Omega (n^3)$.

We say that $f(n) = \theta(g(n))$ when there exist positive constants $c_{1}$ and $c_{2}$ such that $c_{1}.g(n) \leqslant f(n) \leqslant c_{2}.g(n)$ for all sufficiently large values of **n**.

This means function **g** is a tight bound for function **f**.

Let us consider a given function, $f(n) = 4.n^3 + 10.n^2 + 5.n + 1$

Considering $g(n) = n^3$, $4.g(n) \leqslant f(n) \leqslant 5.g(n)$ for all sufficiently large values of **n**.
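
For completeness, the two bounds derived above can be combined into explicit constants for the definition (the threshold $n \geqslant 11$ is the point from which the upper bound holds):

$$4.n^3 \;\leqslant\; 4.n^3 + 10.n^2 + 5.n + 1 \;\leqslant\; 5.n^3 \quad \text{for all } n \geqslant 11,$$

so the definition is satisfied with $c_{1} = 4$, $c_{2} = 5$ and $n_{0} = 11$.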

Hence, the complexity of **f(n)** can be represented as $\theta (g(n))$, i.e. $\theta (n^3)$.

The asymptotic upper bound provided by **O-notation** may or may not be asymptotically tight. The bound $2.n^2 = O(n^2)$ is asymptotically tight, but the bound $2.n = O(n^2)$ is not.

We use **o-notation** to denote an upper bound that is not asymptotically tight.

We formally define **o(g(n))** (little-oh of g of n) as the set of all functions **f(n)** such that, for any positive constant $c > 0$, there exists a value $n_{0} > 0$ with $0 \leqslant f(n) < c.g(n)$ for all $n > n_{0}$.

Intuitively, in the **o-notation**, the function **f(n)** becomes insignificant relative to **g(n)** as **n** approaches infinity; that is,

$$\lim_{n \rightarrow \infty}\left(\frac{f(n)}{g(n)}\right) = 0$$

Let us consider the same function, $f(n) = 4.n^3 + 10.n^2 + 5.n + 1$

Considering $g(n) = n^{4}$,

$$\lim_{n \rightarrow \infty}\left(\frac{4.n^3 + 10.n^2 + 5.n + 1}{n^4}\right) = 0$$
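
Dividing each term of the numerator by $n^4$ shows why the limit is zero:

$$\lim_{n \rightarrow \infty}\left(\frac{4.n^3 + 10.n^2 + 5.n + 1}{n^4}\right) = \lim_{n \rightarrow \infty}\left(\frac{4}{n} + \frac{10}{n^2} + \frac{5}{n^3} + \frac{1}{n^4}\right) = 0$$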

Hence, the complexity of **f(n)** can be represented as $o(g(n))$, i.e. $o(n^4)$.

We use **ω-notation** to denote a lower bound that is not asymptotically tight. Formally, we define **ω(g(n))** (little-omega of g of n) as the set of all functions **f(n)** such that, for any positive constant $c > 0$, there exists a value $n_{0} > 0$ with $0 \leqslant c.g(n) < f(n)$ for all $n > n_{0}$.

For example, $\frac{n^2}{2} = \omega (n)$, but $\frac{n^2}{2} \neq \omega (n^2)$. If the limit exists, the relation $f(n) = \omega (g(n))$ implies

$$\lim_{n \rightarrow \infty}\left(\frac{f(n)}{g(n)}\right) = \infty$$

That is, **f(n)** becomes arbitrarily large relative to **g(n)** as **n** approaches infinity.

Let us consider the same function, $f(n) = 4.n^3 + 10.n^2 + 5.n + 1$

Considering $g(n) = n^2$,

$$\lim_{n \rightarrow \infty}\left(\frac{4.n^3 + 10.n^2 + 5.n + 1}{n^2}\right) = \infty$$
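
Again, dividing each term by $n^2$ makes the divergence explicit:

$$\lim_{n \rightarrow \infty}\left(\frac{4.n^3 + 10.n^2 + 5.n + 1}{n^2}\right) = \lim_{n \rightarrow \infty}\left(4.n + 10 + \frac{5}{n} + \frac{1}{n^2}\right) = \infty$$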

Hence, the complexity of **f(n)** can be represented as $\omega (g(n))$, i.e. $\omega (n^2)$.

Apriori analysis means that the analysis is performed before running the algorithm on any specific system. In this stage, the time (or space) function of the algorithm is defined using a theoretical model of computation. Hence, we determine the time and space complexity of an algorithm by just examining the algorithm itself, rather than running it on a particular system with a particular memory, processor, and compiler.

Apostiari analysis of an algorithm means that the analysis is performed only after running the algorithm on a system. It directly depends on the system and changes from system to system.

In industry, we cannot perform apostiari analysis, as software is generally made for anonymous users who run it on systems different from those present in the industry.

This is the reason that apriori analysis uses asymptotic notations to determine time and space complexity: the absolute values change from computer to computer, but asymptotically they remain the same.
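
The contrast can be illustrated with a small sketch (the summation task, the per-iteration operation count, and the function names are illustrative assumptions, not part of the original text): the apriori estimate counts abstract operations and is machine-independent, while the apostiari measurement reports wall-clock time on one particular machine.

```python
import time

# Apriori view: count basic operations; the result is machine-independent.
def sum_of_squares_ops(n):
    total, ops = 0, 0
    for i in range(n):
        total += i * i          # one multiplication and one addition per pass
        ops += 2
    return total, ops           # T(n) = 2n basic operations

# Apostiari view: measure wall-clock time; the result varies between systems.
def sum_of_squares_timed(n):
    start = time.perf_counter()
    total = sum(i * i for i in range(n))
    return total, time.perf_counter() - start

if __name__ == "__main__":
    n = 100_000
    _, ops = sum_of_squares_ops(n)
    _, seconds = sum_of_squares_timed(n)
    print(f"apriori estimate : T(n) = {ops} basic operations for n = {n}")
    print(f"apostiari measure: {seconds:.6f} seconds on this machine")
```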
