Q: Discrete Memoryless Source ( Answered, 4 out of 5 stars, 0 Comments )
Subject: Discrete Memoryless Source
Category: Miscellaneous
Asked by: dee296-ga
List Price: $5.00
Posted: 04 Aug 2005 11:39 PDT
Expires: 03 Sep 2005 11:39 PDT
Question ID: 551733
I would like this by 9am est. tomorrow.  Please show all work thanks.

Consider a discrete memoryless source (DMS) with seven possible
symbols X1, X2, ..., X7, having probabilities of occurrence as follows:

X1: 0.350
X2: 0.300
X3: 0.200
X4: 0.100
X5: 0.030
X6: 0.015
X7: 0.005

Use the Huffman coding procedure to find the optimum, uniquely
decodable, variable-length code associated with the set of symbols.

Note: In constructing the Huffman coding tree, assign the upper branch
or left side branch a 0 and the lower branch or right side branch a 1.

Then evaluate the entropy associated with the DMS from above....
Note: log2 x = log10 x / log10 2
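The change-of-base identity in the note can be checked with a quick sketch (Python here is my own choice, not part of the original question):

```python
import math

# The note's identity: log2(x) = log10(x) / log10(2)
x = 0.35
via_identity = math.log10(x) / math.log10(2)
direct = math.log2(x)
print(abs(via_identity - direct) < 1e-12)  # True
```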
Subject: Re: Discrete Memoryless Source
Answered By: maniac-ga on 04 Aug 2005 19:15 PDT
Rated:4 out of 5 stars
Hello Dee296,

With the values you provided, and following an excellent guide on
Huffman coding, the code I came up with is:
  X1: 0
  X2: 11
  X3: 101
  X4: 1001
  X5: 10001
  X6: 100001
  X7: 100000

I built up the tree starting with the least frequent values as follows:
Note: (A , B) refers to the one / zero bit values for the left / right
branches of the tree.

  (X6, X7), total frequency 0.020
  (X5, (X6 , X7)), total frequency 0.050
  (X4, (X5, (X6, X7))), total frequency 0.150
  (X3, (X4, (X5, (X6, X7)))), total frequency 0.350
At this point, there are three "trees" remaining in the forest: the
one above at frequency 0.350, X1 at 0.350, and X2 at 0.300. I chose to
merge the big tree with X2, but merging X2 with X1 works as well. With
my choice, I get
  (X2, (X3, (X4, (X5, (X6, X7))))), total frequency 0.650
  ((X2, (X3, (X4, (X5, (X6, X7))))), X1), total frequency 1.000
Walking the tree (left is one, right is zero, per the note above) provides the encodings listed above.
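The merge procedure described above can be sketched in Python (my own sketch, not the answerer's code). Note that `heapq` breaks the 0.350-vs-0.350 tie differently than the answer did, so individual codewords come out differently, but the codeword lengths remain optimal and the average length matches:

```python
import heapq

probs = {"X1": 0.350, "X2": 0.300, "X3": 0.200, "X4": 0.100,
         "X5": 0.030, "X6": 0.015, "X7": 0.005}

# Heap entries: (probability, tie-break counter, {symbol: code-so-far}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
count = len(heap)
while len(heap) > 1:
    # Merge the two least-probable trees, prepending one bit to each side.
    p0, _, zero_side = heapq.heappop(heap)
    p1, _, one_side = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in zero_side.items()}
    merged.update({s: "1" + c for s, c in one_side.items()})
    heapq.heappush(heap, (p0 + p1, count, merged))
    count += 1

codes = heap[0][2]
avg = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)
print(round(avg, 2))  # 2.22, the same average length as the answer's code
```

The resulting code is prefix-free (uniquely decodable), as required by the question.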

According to the source code I referred to, the calculation of entropy is
 sum (-p(i) * log2 (p(i)))
where p(i) is the probability of each symbol (1 through 7 in this
case). Using that formula I get
  entropy = 2.128638214
which is close to the computed "average number of bits":
  sum ( p(i) * length of encoded symbol i)
  average bits = 2.22
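Both figures can be reproduced with a short Python check (again my own sketch; the codeword lengths are read off the listing at the top of the answer):

```python
import math

probs = [0.350, 0.300, 0.200, 0.100, 0.030, 0.015, 0.005]
lengths = [1, 2, 3, 4, 5, 6, 6]  # codeword lengths of X1..X7 from the listing

# Entropy of the DMS: sum of -p(i) * log2(p(i)).
entropy = sum(-p * math.log2(p) for p in probs)
# Average codeword length: sum of p(i) * length(i).
avg_bits = sum(p * n for p, n in zip(probs, lengths))

print(round(entropy, 6))   # ≈ 2.128638
print(round(avg_bits, 2))  # 2.22
```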

I found this information using the search phrase
  huffman coding
and following the most promising references.

If this is a homework assignment, I strongly suggest you check the
result I provided carefully. The algorithm I referred to may not be
exactly the same as the one your professor expects, and I assumed the
term "upper branch" refers to the branch with higher probability and
"lower branch" to the branch with lower probability.

If you need some additional information or if you find the answer
unclear, please make a clarification request prior to your deadline so
I can provide some further explanation.

dee296-ga rated this answer: 4 out of 5 stars and gave an additional tip of $5.00
This was great I was able to use the references and logic to complete it.

There are no comments at this time.
