- Create a collection of n initial Huffman trees, each a single leaf node holding one character. Place all n trees into a priority queue ordered by weight (frequency).
- Remove the two trees with the smallest weights from the queue. Combine them into a new tree whose root has those two trees as children and whose weight is the sum of its children's weights.
- Insert the new tree back into the priority queue.
- Repeat steps 2-3 until all partial Huffman trees have been merged into one.
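The steps above can be sketched in Python, using the standard-library `heapq` module as the priority queue. The function names, the tuple layout, and the use of an insertion counter to break weight ties are illustrative choices, not part of the original text:

```python
import heapq

def build_huffman(freqs):
    """Build a Huffman tree from a dict mapping symbol -> weight.

    A leaf is represented by its symbol; an internal node by a
    (left, right) tuple. Heap entries are (weight, tiebreak, tree);
    the unique tiebreak counter keeps heapq from comparing trees.
    """
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Remove the two trees with the smallest weights ...
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        # ... combine them under a new root whose weight is the sum,
        # and push the new tree back into the priority queue.
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    return heap[0][2]

def assign_codes(tree, prefix="", codes=None):
    """Walk the tree, labelling left edges 0 and right edges 1."""
    if codes is None:
        codes = {}
    if isinstance(tree, tuple):
        assign_codes(tree[0], prefix + "0", codes)
        assign_codes(tree[1], prefix + "1", codes)
    else:
        codes[tree] = prefix or "0"
    return codes
```

For example, with the frequencies {a: 45, b: 13, c: 12, d: 16, e: 9, f: 5}, the most frequent symbol `a` receives a 1-bit code while the rarest symbols `e` and `f` receive 4-bit codes.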

This is a greedy algorithm: at each iteration it makes the locally best ("greedy") decision, merging the two subtrees with the smallest weights. Does this greedy strategy actually produce an optimal code?

- Lemma: Let x and y be the two least frequent characters. There exists an optimal code tree in which x and y are siblings and their depth is at least as great as that of any other leaf in the tree.
- Theorem: Huffman codes are optimal prefix-free binary codes; that is, the greedy algorithm constructs a Huffman tree with the minimum external path weight (the sum, over all leaves, of each leaf's weight times its depth) for a given set of letters.
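The lemma can be checked on a small instance. The sketch below, a self-contained variant of the construction that tracks only leaf depths (the function name and representation are illustrative assumptions), shows that for the frequencies {a: 1, b: 2, c: 4} the two least frequent characters, a and b, end up as siblings at the maximum depth:

```python
import heapq

def huffman_depths(freqs):
    """Return a dict mapping each symbol to its depth in the Huffman tree.

    Each heap entry is (weight, tiebreak, depths), where depths maps the
    symbols in that subtree to their depth within it; merging two subtrees
    pushes every contained leaf one level deeper.
    """
    heap = [(w, i, {sym: 0}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]
```

Here `huffman_depths({'a': 1, 'b': 2, 'c': 4})` places a and b together at depth 2 (the deepest level) and c alone at depth 1, matching the lemma.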
