Chapter 2, Building Abstractions with Data

Section - 2.3 Symbolic Data

Exercise 2.71


In Huffman encoding we keep combining the pair of nodes with minimum weights. In this exercise the relative frequencies of the n symbols are 1, 2, 4, ..., 2^(n-1), so we shall first combine:

1 and 2, giving a node of weight 3, then
3 and 4, giving 7, then
7 and 8, giving 15, then
15 and 16, giving 31, then
... and so on, until the last leaf, of weight 2^(n-1), is merged in.
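As a quick illustration (not part of the original exercise), here is a small Python sketch of the greedy merging step; the function name huffman_merges is my own, and heapq is used only to pick the two smallest weights at each step.

import heapq

def huffman_merges(n):
    # Simulate Huffman tree construction for weights 1, 2, 4, ..., 2^(n-1)
    # and return the (smaller, larger, combined) weights in merge order.
    heap = [2 ** k for k in range(n)]
    heapq.heapify(heap)
    merges = []
    while len(heap) > 1:
        a = heapq.heappop(heap)   # smallest remaining weight
        b = heapq.heappop(heap)   # second smallest remaining weight
        merges.append((a, b, a + b))
        heapq.heappush(heap, a + b)
    return merges

for a, b, c in huffman_merges(5):
    print(f"combine {a} and {b} -> {c}")
# combine 1 and 2 -> 3
# combine 3 and 4 -> 7
# combine 7 and 8 -> 15
# combine 15 and 16 -> 31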

We can see above that, except for the first one, every combination merges a new word/symbol (a leaf) with the node produced by the previous combination. This is because 1 + 2 + ... + 2^(k-1) = 2^k - 1, so the subtree built so far always weighs less than the next unmerged leaf of weight 2^k. The first combination, however, merges the two words with frequencies 1 and 2.

Clearly the leaf added in the last combination corresponds to the smallest number of bits, because it has the least depth (= 1). It follows that the smallest number of bits we need is 1. Since this word is added in the last combination, it is the word with the highest frequency, i.e. the one with frequency 2^(n-1).

Similarly, the leaves merged in the first combination correspond to the largest number of bits, because they have the greatest depth, n - 1. It follows that the number of bits we need for the least frequent symbol (the one with frequency 1) is n - 1 bits.
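To double-check both answers, the following sketch (again my own helper name, not code from the book) computes each symbol's code length by tracking how deep every leaf ends up during the merging; for n = 5 it reports 1 bit for the most frequent symbol and 4 = n - 1 bits for the least frequent.

import heapq
from itertools import count

def huffman_code_lengths(weights):
    # Compute the Huffman code length (leaf depth) for each weight.
    tie = count()  # tie-breaker so the heap never compares the leaf lists
    heap = [(w, next(tie), [i]) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    depth = [0] * len(weights)
    while len(heap) > 1:
        wa, _, la = heapq.heappop(heap)
        wb, _, lb = heapq.heappop(heap)
        for leaf in la + lb:   # every leaf under the merged node
            depth[leaf] += 1   # sinks one level deeper
        heapq.heappush(heap, (wa + wb, next(tie), la + lb))
    return depth

n = 5
weights = [2 ** k for k in range(n)]   # 1, 2, 4, 8, 16
print(dict(zip(weights, huffman_code_lengths(weights))))
# {1: 4, 2: 4, 4: 3, 8: 2, 16: 1}
# most frequent (16) needs 1 bit; least frequent (1) needs n - 1 = 4 bits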