===========================================================================
CSC 310                      Lecture 15 Summary                  Fall 2007
===========================================================================

Topics:
- Channel capacity
- Extensions of a channel
- Codes for extensions
- References: sections 9.5, 9.6, 9.7 (skipping Shannon's second theorem
  for now).

----------------
Channel capacity
----------------

Idea:
- The equation I(X;Y) = H(X) - H(X|Y) can be interpreted as follows.
  H(X) is the entropy/uncertainty/surprise about the input symbol X.
  H(X|Y) is the uncertainty about the input symbol from the decoder's
  point of view: the decoder knows Y and must infer what the input
  symbol was. From this perspective, I(X;Y) is precisely the amount of
  information the channel gives the decoder about the input X: before
  seeing the output Y, the decoder's uncertainty about X is H(X);
  afterwards, it is H(X|Y) = H(X) - I(X;Y).
- We define C, the channel capacity, to be the maximum of I(X;Y) over
  all input distributions p_i = Pr[X = a_i].
- We computed this for the BSC with flip probability f to be 1 - H_2(f).
  (A small numeric check appears at the end of this summary.) In
  tutorial, you saw the capacity of the Z channel.

References: section 9.5

-----------------------
Extensions of a channel
-----------------------

Idea:
- The N-th extension of a channel corresponds to N independent uses of
  the channel grouped together. Formally, the input (output) alphabet
  consists of N-tuples of symbols from the original input
  (respectively, output) alphabet.
- The N uses of the channel are independent, so the transition
  probabilities are
      Q_{j_1..j_N | i_1..i_N} = Q_{j_1|i_1} * ... * Q_{j_N|i_N}
- If the N input symbols are chosen independently (each with the same
  distribution), all entropies and mutual information for the extended
  channel can be shown to be N times the respective values for the
  original channel. (A sketch checking this for N = 2 appears at the
  end of this summary.)

References: section 9.7 (only subsection "Extended channels").

--------------------
Codes for extensions
--------------------

Definitions:
- A code for the N-th extension is a subset C of (A_X)^N. These are the
  only messages we ever transmit through the channel. The elements of C
  are called codewords.
- An (N,K) block code C has 2^K codewords. We encode K-tuples of symbols
  we want to transmit into N-tuples which are codewords in C.
- The rate of a code is R := log_2(|C|) / N. For an (N,K) block code,
  this is K/N. (A tiny worked example appears at the end of this
  summary.)

References: section 9.6 (for definitions).
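
Numeric check of the BSC capacity (not part of the original lecture; the
helper names h2 and bsc_mutual_information are mine, chosen for
illustration). The sketch searches over input distributions for a BSC with
flip probability f and confirms that I(X;Y) peaks at the uniform input,
where it equals 1 - H_2(f).

    from math import log2

    def h2(p):
        """Binary entropy H_2(p) in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_mutual_information(p0, f):
        """I(X;Y) for a BSC with Pr[X = 0] = p0 and flip probability f."""
        q0 = p0 * (1 - f) + (1 - p0) * f   # Pr[Y = 0]
        # I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = H_2(f) for either input.
        return h2(q0) - h2(f)

    f = 0.1
    best_I, best_p0 = max((bsc_mutual_information(p0 / 1000, f), p0 / 1000)
                          for p0 in range(1001))
    print(best_p0, best_I)   # maximized at p0 = 0.5
    print(1 - h2(f))         # the capacity 1 - H_2(0.1), the same value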
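
A similar sketch (again my illustration, not course code) checks the claim
about extensions: building the 2nd extension of a BSC from the product
transition probabilities Q_{j_1 j_2 | i_1 i_2} = Q_{j_1|i_1} * Q_{j_2|i_2}
and feeding it independent uniform inputs gives exactly twice the mutual
information of a single use.

    from itertools import product
    from math import log2

    def mutual_information(px, Q):
        """I(X;Y) in bits, given input distribution px[i] and transitions Q[j][i]."""
        nx, ny = len(px), len(Q)
        py = [sum(Q[j][i] * px[i] for i in range(nx)) for j in range(ny)]
        I = 0.0
        for i in range(nx):
            for j in range(ny):
                pxy = px[i] * Q[j][i]
                if pxy > 0:
                    I += pxy * log2(pxy / (px[i] * py[j]))
        return I

    f = 0.1
    Q1 = [[1 - f, f],
          [f, 1 - f]]        # single BSC use, Q1[j][i]

    # 2nd extension: inputs and outputs are pairs, transition probabilities multiply.
    pairs = list(product(range(2), repeat=2))
    Q2 = [[Q1[j1][i1] * Q1[j2][i2] for (i1, i2) in pairs] for (j1, j2) in pairs]

    I1 = mutual_information([0.5, 0.5], Q1)
    I2 = mutual_information([0.25] * 4, Q2)
    print(I1, I2)                     # I2 should be exactly 2 * I1
    print(abs(I2 - 2 * I1) < 1e-9)    # True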
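
Finally, a tiny worked example of the rate definition (my example, not from
the lecture): the length-3 binary repetition code viewed as an
(N, K) = (3, 1) block code.

    from math import log2

    # Codewords of the length-3 repetition code: a subset C of {0,1}^3.
    C = {(0, 0, 0), (1, 1, 1)}
    N = 3
    K = log2(len(C))   # 2^K codewords, so K = 1
    R = K / N          # rate R = log_2(|C|) / N = K / N
    print(K, R)        # 1.0 0.333...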