Elements of Information Theory
- Model of Digital Communications System
- A Logarithmic Measure for Information
- Mutual Information
- Units of Information
- Self-Information
- News...
- Example of Information Measure Calculation
- Mutual Information is Symmetrical
- Conditional Self-Information
- Average Mutual Information
- Average Self-Information - Entropy
- Binary Entropy Function
- Conditional Entropy
Model of Digital Communications System
[Block diagram: source coding provides a compact representation of the source signal at a given fidelity; channel coding yields a digital channel with improved error characteristics; the modem maps the digital stream onto the (usually analog) physical channel.]
A Logarithmic Measure for Information
Mutual Information
Units of Information
Self-Information
News...
Example of Information Measure Calculation
Mutual Information is Symmetrical
Example: Mutual Information for Binary Input Binary Output Channel
[Channel transition diagram: input 0 goes to output 0 with probability 1-p and to output 1 with probability p; input 1 goes to output 1 with probability 1-p and to output 0 with probability p.]
Example (cont.)
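The calculation in this example can be sketched numerically; the crossover probability p = 0.1 and the equiprobable inputs below are assumptions for illustration, not values from the slides:

    from math import log2

    # Binary symmetric channel of the diagram above.
    # Assumed values, for illustration only (not given on the slides):
    p = 0.1                    # crossover probability
    P_x = {0: 0.5, 1: 0.5}     # input probabilities, assumed equiprobable

    # Transition probabilities P(y|x), indexed as (y, x), from the diagram
    P_y_given_x = {(0, 0): 1 - p, (1, 0): p,
                   (0, 1): p,     (1, 1): 1 - p}

    # Output probabilities P(y) = sum over x of P(y|x) P(x)
    P_y = {y: sum(P_y_given_x[(y, x)] * P_x[x] for x in (0, 1)) for y in (0, 1)}

    # Mutual information of a particular input/output pair: I(x; y) = log2(P(y|x) / P(y))
    for x in (0, 1):
        for y in (0, 1):
            print(f"I(x={x}; y={y}) = {log2(P_y_given_x[(y, x)] / P_y[y]):+.3f} bits")

With these assumed numbers, an output that agrees with the input carries I ≈ +0.848 bits, while a disagreeing output carries I ≈ -2.32 bits.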
Conditional Self-Information
Average Mutual Information
Average Self-Information - Entropy
Binary Entropy Function
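For reference, the binary entropy function referred to here has the standard form (written out from the usual definition, since only the slide title survived):

    H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p), \qquad 0 \le p \le 1

It is maximized at p = 1/2, where H_b(1/2) = 1 bit, and equals 0 at p = 0 and p = 1.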
Conditional Entropy
Example: Conditional Entropy and Average Mutual Information for Binary Input Binary Output Channel
Binary Input Binary Output Channel
Summary
(Self-)Information (of event x_i): I(x_i) = -log_2 P(x_i)   (in bits)
Entropy (of a source with alphabet X) = average self-information: H(X) = -Σ_i P(x_i) log_2 P(x_i)   (in bits)
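A minimal numeric illustration of these two definitions; the source distribution below is made up for the example:

    from math import log2

    def self_information(p):
        """Self-information of an event with probability p, in bits."""
        return -log2(p)

    def entropy(probs):
        """Entropy (average self-information) of a source, in bits per symbol."""
        return sum(p * self_information(p) for p in probs if p > 0)

    # Assumed example distribution (not from the slides):
    P = [0.5, 0.25, 0.125, 0.125]
    print([self_information(p) for p in P])   # [1.0, 2.0, 3.0, 3.0] bits
    print(entropy(P))                          # 1.75 bits/symbol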
Source Coding
- Coding for a Discrete Memoryless Source
- Fixed Length Code Words
- Variable-Length Code Words
- Source Coding Theorem
- Example: Variable Length Source Encoding
- Huffman Code
- Huffman Code for Pairs of Letters
- Run Length Coding
- Classification of Coding Techniques
- Other Coding/Compression Techniques
Coding for a Discrete Memoryless Source
Memoryless: symbols are statistically independent (seldom the case in practice)
Finite alphabet of symbols: x_i, i = 1, 2, ..., L, with probabilities P(x_i), i = 1, 2, ..., L
Entropy: H(X) = -Σ_i P(x_i) log_2 P(x_i) ≤ log_2 L
(Average) Code Rate: R (bits/symbol)
Efficiency of Code = H(X) / R
Fixed Length Code Words: R = ⌈log_2 L⌉
Fixed Length Code Words
R = ⌈log_2 L⌉
If L is a power of 2, R = log_2 L; if, in addition, the symbols are equiprobable, efficiency = 1
Otherwise, efficiency can be improved by coding blocks of letters
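A quick worked check of these definitions for a fixed-length code; the five-symbol source below is assumed for illustration:

    from math import ceil, log2

    # Assumed source with L = 5 symbols and made-up probabilities:
    P = [0.4, 0.3, 0.15, 0.1, 0.05]

    H = -sum(p * log2(p) for p in P)    # entropy H(X), bits/symbol (~2.01)
    R = ceil(log2(len(P)))              # fixed-length rate: ceil(log2 5) = 3 bits/symbol

    print(f"H(X) = {H:.3f} bits, R = {R} bits, efficiency = {H / R:.3f}")  # ~0.67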
Variable-Length Code Words
Entropy Coding: find a code that assigns short code words to the most probable letters (symbols) and longer code words to the less frequent ones; then, on average, the representation of the source output is shorter
Example: Morse code (1800s)
(Desirable) Properties of Codes: uniquely decodable, instantaneously decodable
Prefix condition (for a code): there is no code word of length l that is identical to the first l bits of another code word of length k > l
If a code possesses the prefix condition, it is instantaneously decodable
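A small sketch of checking the prefix condition for a candidate code; the two example codes are invented for illustration:

    def satisfies_prefix_condition(codewords):
        """True if no code word is a prefix of another (hence instantaneously decodable)."""
        return not any(a != b and b.startswith(a)
                       for a in codewords for b in codewords)

    print(satisfies_prefix_condition(["0", "10", "110", "111"]))   # True: a prefix code
    print(satisfies_prefix_condition(["0", "01", "011", "111"]))   # False: "0" begins "01"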
Source Coding Theorem
Let X be the ensemble of letters from a Discrete Memoryless Source with finite entropy H(X). It is possible to construct a code that satisfies the prefix condition and has an average length R that satisfies H(X) ≤ R < H(X) + 1.
Example: Variable Length Source Encoding
Example (cont.): Another Encoding for the Same Source
Huffman Code
[Huffman code construction tree for an eight-symbol source x_1, ..., x_8]
The Huffman code is optimal
Arithmetic Coding is another entropy coding technique (like Huffman)
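A compact sketch of the Huffman construction, which repeatedly merges the two least probable nodes; the probabilities assigned to x_1 ... x_8 below are assumptions for illustration, not the values used on the slide:

    import heapq
    from itertools import count

    def huffman_code(probs):
        """Build a binary Huffman code for a {symbol: probability} source."""
        tie = count()  # tie-breaker so equal probabilities never compare the dicts
        heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p0, _, code0 = heapq.heappop(heap)   # least probable subtree
            p1, _, code1 = heapq.heappop(heap)   # next least probable subtree
            merged = {s: "0" + c for s, c in code0.items()}
            merged.update({s: "1" + c for s, c in code1.items()})
            heapq.heappush(heap, (p0 + p1, next(tie), merged))
        return heap[0][2]

    # Assumed probabilities for the eight symbols (illustrative only):
    P = {"x1": 0.30, "x2": 0.20, "x3": 0.15, "x4": 0.12,
         "x5": 0.10, "x6": 0.06, "x7": 0.04, "x8": 0.03}
    code = huffman_code(P)
    R = sum(P[s] * len(code[s]) for s in P)      # average code word length
    print(code, f"R = {R:.2f} bits/symbol")

By the Source Coding Theorem, the resulting average length R lies within one bit of the entropy H(X) of the assumed source.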
Huffman Code for Pairs of Letters
Encoding pairs of letters: efficiency 99%
Encoding single letters: efficiency 97.9%
Run Length Coding
Code (long) sequences of identical symbols by the symbol and its number of occurrences, instead of repeating the symbol
Needs escape characters; byte-stuffing example: ! = escape character, !! = !
Example of Run Length Coding:
  text: ABCCCCCCCCDEFGGG
  coded text: ABC!8DEFGGG
Savings of RLC for the run of C's: 3/8 (the 8 C's are coded with 3 symbols)
Use a lower limit to apply RLC, e.g., 4 consecutive identical symbols in this example
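A sketch of the scheme described above, following the convention of the example (a run of at least 4 identical symbols is coded as the symbol, the escape character '!', and the run length; a literal '!' is byte-stuffed as '!!'):

    def rle_encode(text, escape="!", min_run=4):
        """Run length coding sketch: long runs become <symbol><escape><run length>."""
        out, i = [], 0
        while i < len(text):
            ch, run = text[i], 1
            while i + run < len(text) and text[i + run] == ch:
                run += 1
            if ch == escape:
                out.append(escape * 2 * run)       # byte-stuffing: each literal '!' -> '!!'
            elif run >= min_run:
                out.append(f"{ch}{escape}{run}")   # e.g. the 8 C's -> "C!8"
            else:
                out.append(ch * run)               # short runs are left untouched
            i += run
        return "".join(out)

    print(rle_encode("ABCCCCCCCCDEFGGG"))   # -> ABC!8DEFGGG, as in the example above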
Classification of Coding Techniques