# Design and Analysis of Algorithms - Quick Sort

Quick sort is based on the divide-and-conquer principle. It is the algorithm of choice in many situations because it is not difficult to implement, it is a good general-purpose sort, and it consumes relatively few resources during execution.

## Advantages

- It is in-place, since it uses only a small auxiliary stack.
- It requires only **O(n log n)** time on average to sort **n** items.
- It has an extremely short inner loop.
- It has been subjected to a thorough mathematical analysis, so very precise statements can be made about its performance.

## Disadvantages

- It is recursive; if recursion is not available, the implementation becomes extremely complicated.
- It requires quadratic (i.e., **O(n^2)**) time in the worst case.
- It is fragile, i.e., a simple mistake in the implementation can go unnoticed and cause it to perform badly.

Quick sort works by partitioning a given array **A[p ... r]** into two non-empty sub-arrays **A[p ... q]** and **A[q+1 ... r]** such that every key in **A[p ... q]** is less than or equal to every key in **A[q+1 ... r]**. Then, the two sub-arrays are sorted by recursive calls to Quick sort. The exact position of the partition depends on the given array, and index **q** is computed as a part of the partitioning procedure.

Algorithm: Quick-Sort (A, p, r)

```
if p < r then
   q ← Partition (A, p, r)
   Quick-Sort (A, p, q)
   Quick-Sort (A, q + 1, r)
```

Note that to sort the entire array, the initial call should be *Quick-Sort (A, 1, length[A])*.

As a first step, Quick Sort chooses one of the items in the array to be sorted as the pivot. Then, the array is partitioned on either side of the pivot: elements that are less than or equal to the pivot move towards the left, while elements that are greater than or equal to the pivot move towards the right.
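The pivot idea described above can be sketched with a minimal, non-in-place version that takes the first element as the pivot. The function name `quick_sort_simple` is illustrative, not part of the original algorithm statement:

```python
def quick_sort_simple(items):
    """Non-in-place sketch of quick sort: pick the first element as
    the pivot, partition the rest around it, and recurse."""
    if len(items) <= 1:
        return items
    pivot = items[0]
    # Elements <= pivot go to the left, elements > pivot to the right.
    left = [x for x in items[1:] if x <= pivot]
    right = [x for x in items[1:] if x > pivot]
    return quick_sort_simple(left) + [pivot] + quick_sort_simple(right)
```

This sketch copies sub-lists at each level, so it trades the in-place property for clarity; the partitioning procedure below shows how the same effect is achieved without extra arrays.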

## Partitioning the Array

The partitioning procedure rearranges the sub-arrays in-place.

Function: Partition (A, p, r)

```
x ← A[p]
i ← p - 1
j ← r + 1
while TRUE do
   repeat j ← j - 1 until A[j] ≤ x
   repeat i ← i + 1 until A[i] ≥ x
   if i < j then
      exchange A[i] ↔ A[j]
   else
      return j
```
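As a runnable sketch, the pseudocode above (a Hoare-style partition with pivot *x = A[p]*) translates to Python roughly as follows, adapted to 0-based indexing; the `repeat ... until` loops become do-while-style loops:

```python
def partition(A, p, r):
    """Hoare-style partition of A[p..r] (inclusive) around x = A[p].
    Returns an index j such that every key in A[p..j] is less than
    or equal to every key in A[j+1..r]."""
    x = A[p]
    i = p - 1
    j = r + 1
    while True:
        # repeat j <- j - 1 until A[j] <= x
        j -= 1
        while A[j] > x:
            j -= 1
        # repeat i <- i + 1 until A[i] >= x
        i += 1
        while A[i] < x:
            i += 1
        if i < j:
            A[i], A[j] = A[j], A[i]   # exchange A[i] <-> A[j]
        else:
            return j

def quick_sort(A, p, r):
    """Sort A[p..r] in place by recursive partitioning."""
    if p < r:
        q = partition(A, p, r)
        quick_sort(A, p, q)
        quick_sort(A, q + 1, r)
```

With 0-based indexing, the initial call to sort the whole array is `quick_sort(A, 0, len(A) - 1)`.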

## Analysis

The worst-case complexity of the Quick-Sort algorithm is **O(n^2)**. However, in the average case, it sorts in **O(n log n)** time.