
Summary

| Description | The picture is an example of Huffman coding. Colors make it clearer, but they are not necessary to understand it (according to Wikipedia's guidelines): probability is shown in red, binary code is shown in blue inside a yellow frame. For a more detailed description see below (I couldn't insert a table here). |
| Source | self-made |
| Date | 18 May 2007 |
| Author | Alessio Damato |
| Permission (Reusing this image) | see below |
Description: Assume a source generates four different symbols {a1, a2, a3, a4} with probabilities {0.4, 0.35, 0.2, 0.05}. Build a binary tree from left to right: take the two least probable symbols and merge them into an equivalent symbol whose probability is the sum of the two. Repeat until only one symbol remains. Then read the tree backwards, from right to left, assigning different bits to different branches (a code sketch follows the table below). The final Huffman code is:
| Symbol | Code |
| a1 | 0 |
| a2 | 10 |
| a3 | 110 |
| a4 | 111 |
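The construction can be sketched in a few lines of Python (an illustration added here, not part of the original file description; the symbol names and probabilities come from the example above, everything else is assumed):

```python
import heapq

# Symbols and probabilities from the example above.
probs = {"a1": 0.40, "a2": 0.35, "a3": 0.20, "a4": 0.05}

# Min-heap of (probability, age, tree): a tree is either a symbol name
# (a leaf) or a (zero_branch, one_branch) pair; "age" records when the
# entry was created and also keeps the heap from ever comparing trees.
heap = [(p, age, sym) for age, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
age = len(heap)

while len(heap) > 1:
    # Merge the two least probable subtrees into one equivalent symbol
    # whose probability is the sum of the two.
    p1, age1, t1 = heapq.heappop(heap)
    p2, age2, t2 = heapq.heappop(heap)
    # Which branch gets the 0 bit is an arbitrary choice; giving 0 to
    # the older entry happens to reproduce the codeword table above.
    pair = (t1, t2) if age1 < age2 else (t2, t1)
    heapq.heappush(heap, (p1 + p2, age, pair))
    age += 1

def read_back(tree, prefix, codes):
    """Read the finished tree back, appending one bit per branch."""
    if isinstance(tree, str):              # leaf: an original symbol
        codes[tree] = prefix or "0"
    else:                                  # internal node: two subtrees
        read_back(tree[0], prefix + "0", codes)
        read_back(tree[1], prefix + "1", codes)

codes = {}
read_back(heap[0][2], "", codes)
print(codes)   # {'a1': '0', 'a2': '10', 'a3': '110', 'a4': '111'}
```

Any other consistent labeling of the two branches at each merge would give an equally valid Huffman code with the same codeword lengths.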
The standard way to represent a signal made of four symbols is to use 2 bits/symbol, while the entropy of the source is about 1.74 bits/symbol. If this Huffman code is used to represent the signal, the average code length is lowered to 1.85 bits/symbol; it is still far from the theoretical limit (the entropy) because the probabilities of the symbols are different from negative powers of two.
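As a quick check of those figures (a sketch added here, not part of the original description), both quantities follow directly from the probabilities and the codeword lengths in the table:

```python
from math import log2

probs = [0.40, 0.35, 0.20, 0.05]   # symbol probabilities
lengths = [1, 2, 3, 3]             # codeword lengths from the table

entropy = -sum(p * log2(p) for p in probs)            # ~ 1.74 bits/symbol
avg_len = sum(p * n for p, n in zip(probs, lengths))  # = 1.85 bits/symbol
print(f"entropy = {entropy:.2f}, average code length = {avg_len:.2f}")
```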
Licensing

I, the copyright holder of this work, hereby publish it under the following licenses:
You may select the license of your choice.
File history
Click on a date/time to view the file as it appeared at that time.
| | Date/Time | Dimensions | User | Comment |
| current | 13:45, 4 December 2007 | 277×133 (13 KB) | Alejo2083 | |
| | 14:59, 18 May 2007 | 277×133 (13 KB) | Alejo2083 | |