
# Design and Analysis - Master’s Theorem

Master’s theorem is one of several methods used to calculate the time complexity of an algorithm. In analysis, time complexities are calculated to determine which version of an algorithm's logic is most efficient. Master’s theorem is applied to recurrence relations.

But before we get deep into the master’s theorem, let us first revise what recurrence relations are −

Recurrence relations are equations that define a sequence of elements in which each term is a function of its preceding terms. In algorithm analysis, recurrence relations usually arise when an algorithm calls itself recursively on smaller inputs.
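As an illustration (a minimal sketch, not part of the theorem itself), a divide-and-conquer routine such as merge sort leads directly to a recurrence: it makes two recursive calls on halves of the input and does linear work to merge them, so its running time satisfies T(n) = 2T(n/2) + n.

```python
def merge_sort(arr):
    """Sort arr by splitting, recursing on halves, and merging.

    Two recursive calls on inputs of size n/2 plus O(n) merging work
    give the recurrence T(n) = 2T(n/2) + n.
    """
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # T(n/2)
    right = merge_sort(arr[mid:])   # T(n/2)
    # Merge step: linear in n.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Solving this recurrence is exactly what the master theorem is for.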

## Problem Statement

Master’s theorem can only be applied to dividing and decreasing recurrence relations. If the relation is neither dividing nor decreasing, the master’s theorem must not be applied.

### Master’s Theorem for Dividing Functions

Consider a relation of type −

T(n) = aT(n/b) + f(n)

where **a >= 1** and **b > 1**,

**n** − size of the problem

**a** − number of sub-problems in the recursion

**n/b** − size of the sub-problems, based on the assumption that all sub-problems are of the same size

**f(n)** − cost of the work done outside the recursion; assumed to be of the form Θ(n^k log^p n), where k >= 0 and p is a real number

If the recurrence relation is in the above given form, then there are three cases in the master theorem that determine the asymptotic bound −

**Case 1** − If a > b^k, then T(n) = Θ(n^(log_b a)). [Recall that log_b a = log a / log b.]

**Case 2** − If a = b^k, then

- (i) if p > -1, T(n) = Θ(n^(log_b a) log^(p+1) n)
- (ii) if p = -1, T(n) = Θ(n^(log_b a) log log n)
- (iii) if p < -1, T(n) = Θ(n^(log_b a))

**Case 3** − If a < b^k, then

- (i) if p >= 0, T(n) = Θ(n^k log^p n)
- (ii) if p < 0, T(n) = Θ(n^k)
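The case analysis above is mechanical once a, b, k, and p are identified, so it can be captured in a small helper. The sketch below (`master_dividing` is a hypothetical name, not a standard library function) only classifies the inputs into one of the bounds; it is an illustration of the case analysis, not a general recurrence solver.

```python
import math

def master_dividing(a, b, k, p):
    """Classify T(n) = a*T(n/b) + Θ(n^k log^p n) per the extended
    master theorem. Assumes a >= 1, b > 1, k >= 0."""
    e = math.log(a, b)  # log_b a
    # Print the exponent as an integer when it is one (e.g. log_2 8 = 3).
    e = int(round(e)) if abs(e - round(e)) < 1e-9 else round(e, 3)
    if a > b ** k:                       # Case 1
        return f"Θ(n^{e})"
    if a == b ** k:                      # Case 2
        if p > -1:
            return f"Θ(n^{e} log^{p + 1} n)"
        if p == -1:
            return f"Θ(n^{e} log log n)"
        return f"Θ(n^{e})"               # p < -1
    if p >= 0:                           # Case 3: a < b^k
        return f"Θ(n^{k} log^{p} n)"
    return f"Θ(n^{k})"
```

For instance, `master_dividing(8, 2, 2, 0)` corresponds to T(n) = 8T(n/2) + n^2 and falls into Case 1.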

### Master’s Theorem for Decreasing Functions

Consider a relation of type −

T(n) = aT(n-b) + f(n), where a >= 0, b > 0, and f(n) is asymptotically positive

Here,

**n** − size of the problem

**a** − number of sub-problems in the recursion

**n-b** − size of the sub-problems, based on the assumption that all sub-problems are of the same size

**f(n)** − cost of the work done outside the recursion; assumed to be of the form Θ(n^k), where k >= 0

If the recurrence relation is in the above given form, then there are three cases in the master theorem to determine the asymptotic notations −

**Case 1** − If a = 1, then T(n) = O(n^(k+1))

**Case 2** − If a > 1, then T(n) = O(a^(n/b) · n^k)

**Case 3** − If a < 1, then T(n) = O(n^k)
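These cases can likewise be sketched as a tiny classifier (`master_decreasing` is a hypothetical helper name introduced here for illustration):

```python
def master_decreasing(a, b, k):
    """Classify T(n) = a*T(n-b) + O(n^k) per the cases above.
    Assumes b > 0 and k >= 0."""
    if a == 1:
        return f"O(n^{k + 1})"                # Case 1: polynomial
    if a > 1:
        return f"O({a}^(n/{b}) * n^{k})"      # Case 2: exponential
    return f"O(n^{k})"                        # Case 3: a < 1
```

For example, `master_decreasing(2, 1, 1)` corresponds to T(n) = 2T(n-1) + n and falls into Case 2.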

## Examples

A few examples of applying the master’s theorem to *dividing recurrence relations* −

### Example 1

Consider a recurrence relation given as T(n) = 8T(n/2) + n^2

In this problem, a = 8, b = 2 and f(n) = Θ(n^k log^p n) = n^2, giving us k = 2 and p = 0.

Since a = 8 > b^k = 2^2 = 4, Case 1 applies. To calculate,

T(n) = Θ(n^(log_b a)) = Θ(n^(log_2 8)) = Θ(n^(log 8 / log 2)) = Θ(n^3)

Therefore, T(n) = Θ(n^3) is the tight bound for this equation.
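The Θ(n^3) bound can be sanity-checked numerically. The sketch below assumes a concrete instance of the recurrence with base case T(1) = 1 and evaluates it at powers of 2; if T(n) = Θ(n^3), then doubling n should multiply T(n) by roughly 2^3 = 8.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Concrete instance of T(n) = 8T(n/2) + n^2, with T(1) = 1 assumed.
    if n <= 1:
        return 1
    return 8 * T(n // 2) + n * n

# For T(n) = Θ(n^3), the ratio T(2n)/T(n) should approach 8 as n grows.
ratio = T(2 ** 11) / T(2 ** 10)
```

At n = 2^10 the ratio is already very close to 8, matching the cubic bound.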

### Example 2

Consider a recurrence relation given as T(n) = 4T(n/2) + n^2

In this problem, a = 4, b = 2 and f(n) = Θ(n^k log^p n) = n^2, giving us k = 2 and p = 0.

Since a = 4 = b^k = 2^2 = 4 and p = 0 > -1, Case 2(i) applies. To calculate,

T(n) = Θ(n^(log_b a) log^(p+1) n) = Θ(n^(log_2 4) log^(0+1) n) = Θ(n^2 log n)

Therefore, T(n) = Θ(n^2 log n) is the tight bound for this equation.

### Example 3

Consider a recurrence relation given as T(n) = 2T(n/2) + n/log n

In this problem, a = 2, b = 2 and f(n) = Θ(n^k log^p n) = n/log n, giving us k = 1 and p = -1.

Since a = 2 = b^k = 2^1 = 2 and p = -1, Case 2(ii) applies. To calculate,

T(n) = Θ(n^(log_b a) log log n) = Θ(n^(log_2 2) log log n) = Θ(n log log n)

Therefore, T(n) = Θ(n log log n) is the tight bound for this equation.

### Example 4

Consider a recurrence relation given as T(n) = 16T(n/4) + n^2/log^2 n

In this problem, a = 16, b = 4 and f(n) = Θ(n^k log^p n) = n^2/log^2 n, giving us k = 2 and p = -2.

Since a = 16 = b^k = 4^2 = 16 and p = -2 < -1, Case 2(iii) applies. To calculate,

T(n) = Θ(n^(log_b a)) = Θ(n^(log_4 16)) = Θ(n^2)

Therefore, T(n) = Θ(n^2) is the tight bound for this equation.

### Example 5

Consider a recurrence relation given as T(n) = 2T(n/2) + n^2

In this problem, a = 2, b = 2 and f(n) = Θ(n^k log^p n) = n^2, giving us k = 2 and p = 0.

Since a = 2 < b^k = 2^2 = 4 and p = 0 >= 0, Case 3(i) applies. To calculate,

T(n) = Θ(n^k log^p n) = Θ(n^2 log^0 n) = Θ(n^2)

Therefore, T(n) = Θ(n^2) is the tight bound for this equation.

### Example 6

Consider a recurrence relation given as T(n) = 2T(n/2) + n^3/log n

In this problem, a = 2, b = 2 and f(n) = Θ(n^k log^p n) = n^3/log n, giving us k = 3 and p = -1.

Since a = 2 < b^k = 2^3 = 8 and p = -1 < 0, Case 3(ii) applies. To calculate,

T(n) = Θ(n^k) = Θ(n^3)

Therefore, T(n) = Θ(n^3) is the tight bound for this equation.

A few examples of applying the master’s theorem to *decreasing recurrence relations* −

### Example 1

Consider a recurrence relation given as T(n) = T(n-1) + n^2

In this problem, a = 1, b = 1 and f(n) = O(n^k) = n^2, giving us k = 2.

Since a = 1, Case 1 applies. To calculate,

T(n) = O(n^(k+1)) = O(n^(2+1)) = O(n^3)

Therefore, T(n) = O(n^3) is the tight bound for this equation.

### Example 2

Consider a recurrence relation given as T(n) = 2T(n-1) + n

In this problem, a = 2, b = 1 and f(n) = O(n^k) = n, giving us k = 1.

Since a > 1, Case 2 applies. To calculate,

T(n) = O(a^(n/b) · n^k) = O(2^(n/1) · n^1) = O(n·2^n)

Therefore, T(n) = O(n·2^n) is the upper bound given by the theorem for this equation.
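Unrolling this recurrence actually gives an exact closed form: assuming the base case T(0) = 0, the solution is T(n) = 2^(n+1) − n − 2, which is consistent with the O(n·2^n) bound. A quick sketch to verify the closed form for small n:

```python
def T(n):
    # Concrete instance of T(n) = 2T(n-1) + n, with T(0) = 0 assumed.
    return 0 if n == 0 else 2 * T(n - 1) + n

def closed_form(n):
    # Closed form obtained by unrolling: T(n) = 2^(n+1) - n - 2.
    return 2 ** (n + 1) - n - 2
```

Comparing the two for the first several values of n confirms they agree.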

### Example 3

Consider a recurrence relation given as T(n) = n^4

In this problem, a = 0 and f(n) = O(n^k) = n^4, giving us k = 4.

Since a < 1, Case 3 applies. To calculate,

T(n) = O(n^k) = O(n^4)

Therefore, T(n) = O(n^4) is the tight bound for this equation.